- LTG, UiO
- Oslo, Norway
- https://www.mn.uio.no/ifi/english/people/aca/davisamu/index.html
- @davidsamuelcz
Stars
Minimal and highly hackable implementation of Looped Transformers with GPT
Minimal Claude Code alternative. Single Python file, zero dependencies, ~250 lines.
A highly compressive and high-quality neural audio codec for speech models.
A Norwegian Language Understanding and Generation Evaluation Benchmark
Fast & memory efficient hashtable based on robin hood hashing for C++11/14/17/20
Truly flash implementation of the DeBERTa disentangled attention mechanism.
Supporting PyTorch FSDP for optimizers
Teaching transformers to play chess
Official implementation of "GPT or BERT: why not both?"
Official implementation of "BERTs are Generative In-Context Learners"
Implementation of the paper "Compositional Generalization with Grounded Language Models", ACL 2024 Findings
Highlight errors in a BibTeX file: missing URLs, capitalization protection, etc.
Scripts and documentation on scaling large language model training on the LUMI supercomputer
Enabling easy statistical significance testing for deep neural networks.
Implementation of the Adan (ADAptive Nesterov momentum algorithm) optimizer in PyTorch