Evaluate interpretability methods on localizing and disentangling concepts in LLMs.
Code for the experiments and website of the paper "Same Task, Different Circuits"
Code for "ResiDual Transformer Alignment with Spectral Decomposition", TMLR 2025
XL-VLMs: General Repository for eXplainable Large Vision Language Models
[COLM 2025] Open-Qwen2VL: Compute-Efficient Pre-Training of Fully-Open Multimodal LLMs on Academic Resources
A Python package for analyzing and transforming neural latent spaces.
A Python library for Secure and Explainable Machine Learning
Fast, differentiable sorting and ranking in PyTorch
An unofficial PyTorch implementation of "Learning-based Video Motion Magnification".
Official implementation of "Interpreting CLIP's Image Representation via Text-Based Decomposition"
Optimizing Airborne Energy Production with Reinforcement Learning
Generic template to bootstrap your Python project.
A collection of papers on Transformers in computer vision. Awesome Transformer with Computer Vision (CV)
[NeurIPS 2022] Zero-Shot Video Question Answering via Frozen Bidirectional Language Models
Escaping the Big Data Paradigm with Compact Transformers, 2021 (Train your Vision Transformers in 30 mins on CIFAR-10 with a single GPU!)
Code and data for the paper "Emergent Visual-Semantic Hierarchies in Image-Text Representations" (ECCV 2024)
Proper PyTorch implementation of ResNets for CIFAR-10/100 that matches the description in the original paper.
Official codebase for Decision Transformer: Reinforcement Learning via Sequence Modeling.
Open-Vocabulary Video Question Answering: A New Benchmark for Evaluating the Generalizability of Video Question Answering Models (ICCV 2023)
Find clusters of different intrinsic dimension.
Implementation of Hinton's forward-forward (FF) algorithm - an alternative to back-propagation
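The forward-forward idea above trains each layer locally: a layer's "goodness" (sum of squared activations) is pushed above a threshold for positive data and below it for negative data, with no backward pass through the network. A minimal one-layer NumPy sketch under a hypothetical toy setup (two Gaussian blobs standing in for Hinton's real/corrupted inputs; threshold, learning rate, and sizes are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "positive" and "negative" data: two Gaussian blobs (hypothetical stand-in
# for the paper's real vs. corrupted inputs).
pos = rng.normal(loc=1.0, size=(256, 10))
neg = rng.normal(loc=-1.0, size=(256, 10))

W = rng.normal(scale=0.1, size=(10, 32))  # a single layer's weights
theta = 1.0                               # goodness threshold (illustrative)
lr = 0.03

def goodness(x, W):
    h = np.maximum(x @ W, 0.0)            # ReLU activations
    return h, (h ** 2).sum(axis=1)        # goodness = sum of squared activations

for step in range(200):
    for x, sign in ((pos, +1.0), (neg, -1.0)):
        h, g = goodness(x, W)
        # Logistic loss: push goodness above theta for positives (+1),
        # below theta for negatives (-1).
        p = 1.0 / (1.0 + np.exp(-sign * (g - theta)))
        dg = -sign * (1.0 - p)            # dLoss/dGoodness per sample
        dh = 2.0 * h * dg[:, None]        # chain rule through the squared norm
        # (the ReLU mask is implicit: h is already zero where inactive)
        W -= lr * x.T @ dh / len(x)       # local gradient step, no backprop

_, g_pos = goodness(pos, W)
_, g_neg = goodness(neg, W)
print(g_pos.mean() > g_neg.mean())
```

In a deeper network, each layer would repeat this local objective on the (normalized) output of the layer below, which is what makes the scheme an alternative to end-to-end backpropagation.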
Official repository for the paper "On the Versatile Uses of Partial Distance Correlation in Deep Learning" (ECCV 2022)
Code for "Unifying Grokking and Double Descent" from the NeurIPS 2022 ML Safety Workshop.