Princeton University
Princeton, NJ
Stars
FULL Augment Code, Claude Code, Cluely, CodeBuddy, Comet, Cursor, Devin AI, Junie, Kiro, Leap.new, Lovable, Manus Agent Tools, NotionAI, Orchids.app, Perplexity, Poke, Qoder, Replit, Same.dev, Trae…
🐙 Guides, papers, lessons, notebooks and resources for prompt engineering, context engineering, RAG, and AI Agents.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
A markup-based typesetting system that is powerful and easy to learn.
Code and documentation to train Stanford's Alpaca models, and generate the data.
DSPy: The framework for programming—not prompting—language models
Development repository for the Triton language and compiler
An up-to-date 2021 summary of recommended books for engineers: computer science, software engineering, entrepreneurship, ideas and philosophy, mathematics, and biographies.
Production-tested AI infrastructure tools for efficient AGI development and community-driven innovation
High accuracy RAG for answering questions from scientific documents with citations
[ICLR 2024] Efficient Streaming Language Models with Attention Sinks
A collection of LLM papers, blogs, and projects, with a focus on OpenAI o1 🍓 and reasoning techniques.
My learning notes and code for ML systems (MLSys).
Live-streamed development of RL tuning for LLM agents.
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
SuperCLUE: A comprehensive benchmark for general-purpose foundation models in Chinese.
A collection of AI for Science paper walkthroughs (continuously updated); papers, datasets, and tutorials available at hyper.ai.
A curated list for Efficient Large Language Models
A reading list for large model safety, security, and privacy (including Awesome LLM Security, Safety, etc.).
Official implementation of Rectified Flow (ICLR 2023 Spotlight).
A library for lattice-based multiparty homomorphic encryption in Go
Recent Transformer-based computer vision (CV) works and related research.
This project collects the latest "call for reviewers" links from top CS/ML/AI conferences and journals.
TinyChatEngine: On-Device LLM Inference Library
[ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723