Stars
HealthCards: Exploring Text-to-Image Generation as Visual Aids for Healthcare Knowledge Democratizing and Education
Official repository for the paper: Can Large Language Models Capture Human Annotator Disagreements?
Implementation of "BitNet: Scaling 1-bit Transformers for Large Language Models" in PyTorch
Python library to work with ConceptNet offline without the need for PostgreSQL
Zenless Zone Zero | ZenlessZoneZero | Hollow Zero | automated combat | automation | image classification | OCR recognition
STREET: a multi-task and multi-step reasoning dataset
arXiv LaTeX Cleaner: Easily clean the LaTeX code of your paper to submit to arXiv
Benchmarking large language models' complex reasoning ability with chain-of-thought prompting
Implementation of the paper "Multi-Agent Exploration via Self-Learning and Social Learning"
Implementation of the paper "WToE: Learning When to Explore in Multi-Agent Reinforcement Learning"
📖 Paper reading list in conversational AI.
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Code for CRATE (Coding RAte reduction TransformEr).
A library for mechanistic interpretability of GPT-style language models
Official Code for Paper: RecurrentGPT: Interactive Generation of (Arbitrarily) Long Text
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
Fast random walk generator for networkx graphs
Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (CIKM 2020)
The hub for EleutherAI's work on interpretability and learning dynamics
Instruct-tune LLaMA on consumer hardware
Code and documentation to train Stanford's Alpaca models, and generate the data.
Open Academic Research on Improving LLaMA to SOTA LLM
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
The source code and the data for ACL 2022 paper "Show Me More Details: Discovering Hierarchies of Procedures from Semi-structured Web Data"
"One cannot conceive anything so strange and so implausible that it has not already been said by one philosopher or another." ― René Descartes
A Pythonic wrapper for the Wikipedia API