- IMPRS-IS, ELLIS
- Germany
- @rpatrik96
- https://path2phd.substack.com/
- https://rpatrik96.github.io
Stars
Zotero MCP: Connects your Zotero research library with Claude and other AI assistants via the Model Context Protocol to discuss papers, get summaries, analyze citations, and more.
Garmin strength-training FIT file generator to bypass the 50-step limit.
Several maximum-likelihood ICA algorithms, including Picard (a usage sketch follows this list).
Shapley Interactions and Shapley Values for Machine Learning
arXiv LaTeX Cleaner: Easily clean the LaTeX code of your paper to submit to arXiv
Multifactorial modeling of response to checkpoint inhibitor immunotherapy from tumor, immune, and clinical features
Reliable, minimal and scalable library for pretraining foundation and world models
A general-purpose, deep learning-first library for constrained optimization in PyTorch
Internal Causal Mechanisms Robustly Predict Language Model Out-of-Distribution Behaviors
Tool for finding minimal pairs given a corpus of words
Compositional Capabilities of Autoregressive Transformers: A Study on Synthetic, Interpretable Tasks
[NeurIPS 2023] Learning Transformer Programs
Experimental modifications to the SAE paradigm to integrate causality into the feature identification process.
Representational similarity consistency across datasets and its driving factors.
SONAR, a new multilingual and multimodal fixed-size sentence embedding space, with a full suite of speech and text encoders and decoders.
Neural Networks and the Chomsky Hierarchy
VISSL is FAIR's library of extensible, modular and scalable components for SOTA Self-Supervised Learning with images.
The largest collection of PyTorch image encoders / backbones (a usage sketch follows this list). Including train, eval, inference, export scripts, and pretrained weights -- ResNet, ResNeXT, EfficientNet, NFNet, Vision Transformer (V…
OpenMMLab Self-Supervised Learning Toolbox and Benchmark
An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers"
Python tool for converting files and office documents to Markdown (a usage sketch follows this list).
Large Concept Models: Language modeling in a sentence representation space
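For the Picard ICA entry above, a minimal sketch of how the python-picard package is typically called; the source shapes, the mixing setup, and the array names are illustrative assumptions, not taken from the starred repo.

```python
import numpy as np
from picard import picard  # python-picard package

# Illustrative toy mixtures: 3 non-Gaussian sources, 1000 samples (assumed shapes).
rng = np.random.RandomState(0)
S = rng.laplace(size=(3, 1000))   # sources
A = rng.randn(3, 3)               # mixing matrix
X = A @ S                         # observed mixtures

# picard returns the whitening matrix K, the unmixing matrix W,
# and the estimated sources Y (roughly W @ K applied to the centered data).
K, W, Y = picard(X, ortho=True, random_state=0)
print(Y.shape)  # (3, 1000)
```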
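For the timm entry above, a minimal sketch of loading a pretrained backbone as a feature extractor; the model name and input size are assumptions chosen for illustration.

```python
import torch
import timm

# num_classes=0 drops the classifier head, so the model returns pooled features.
model = timm.create_model("resnet50", pretrained=True, num_classes=0)
model.eval()

x = torch.randn(1, 3, 224, 224)  # dummy image batch (assumed input size)
with torch.no_grad():
    features = model(x)
print(features.shape)  # e.g. torch.Size([1, 2048]) for resnet50
```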
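For the Markdown-conversion tool above, assuming it is Microsoft's MarkItDown, a minimal conversion sketch; the input file name is a placeholder.

```python
from markitdown import MarkItDown

md = MarkItDown()                   # optionally configured with an LLM client for image descriptions
result = md.convert("slides.pptx")  # placeholder path; Office documents, PDFs, etc. are handled similarly
print(result.text_content)          # the extracted Markdown
```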