Stars
Simple and efficient PyTorch-native transformer text generation in <1000 LOC of Python.
A list of microgrant programs for your good ideas
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Benchmarking tool for assessing LLM performance across different hardware
A multi-column, SQL-inspired vector DB with implicit embeddings
Development repository for the Triton language and compiler
Large-scale 4D-parallel pre-training for 🤗 transformers with Mixture of Experts *(still a work in progress)*
Follow-up work to "Simple Synthetic Data Reduces Sycophancy in Large Language Models" by Wei et al., 2023: https://arxiv.org/abs/2308.03958
Easily generate synthetic data for classification tasks using LLMs
Repilot, a patch-generation tool introduced in the ESEC/FSE'23 paper "Copiloting the Copilots: Fusing Large Language Models with Completion Engines for Automated Program Repair"
Irresponsible innovation. Try it now at https://chat.dev/
Notebooks for the workshop "Understanding Neural Networks: From Basics to Reverse Engineering Them" at CodeDay Lucknow
Identifying the circuit behind pronoun prediction in GPT-2 Small
The Incredible PyTorch: a curated list of tutorials, papers, projects, communities and more relating to PyTorch.