Stars
This repository showcases various advanced techniques for Retrieval-Augmented Generation (RAG) systems. RAG systems combine information retrieval with generative models to provide accurate and cont…
This project converts the MXNet implementations in the original Dive into Deep Learning (动手学深度学习) book to PyTorch.
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. We also show you how to solve end-to-end problems using Llama mode…
AISystem covers the full low-level AI stack, including AI chips, AI compilers, and AI inference and training frameworks.
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
Code release for NeRF (Neural Radiance Fields)
YSDA course in Natural Language Processing
Inpaint anything using Segment Anything and inpainting models.
Collecting, curating, and publishing Chinese NLP corpora/datasets, working with like-minded contributors to advance Chinese natural language processing.
AirLLM: 70B-model inference on a single 4GB GPU.
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
FinRobot: An Open-Source AI Agent Platform for Financial Analysis using LLMs 🚀 🚀 🚀
骆驼 (Luotuo): open-sourced Chinese language models. Developed by 陈启源 @ Central China Normal University & 李鲁鲁 @ SenseTime & 冷子昂 @ SenseTime.
HuggingLLM, Hugging Future.
⭐️ NLP algorithms built on the transformers library, supporting text classification, text generation, information extraction, text matching, RLHF, SFT, etc.
This repository contains various advanced techniques for Retrieval-Augmented Generation (RAG) systems.
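The core RAG loop the two repositories above implement can be sketched in a few lines: retrieve the documents most relevant to a query, then pack them into a prompt for a generator. This is a minimal illustration, not either repo's actual API — the bag-of-words scorer and prompt template stand in for a real embedding model and LLM call.

```python
from collections import Counter

def retrieve(query, docs, k=2):
    """Rank documents by word overlap with the query (toy retriever)."""
    q = Counter(query.lower().split())
    scored = [(sum((Counter(d.lower().split()) & q).values()), d) for d in docs]
    scored.sort(key=lambda pair: -pair[0])
    return [d for score, d in scored[:k] if score > 0]

def build_prompt(query, docs):
    """Pack the retrieved context and the question into a generation prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

A production system would swap the overlap scorer for dense embeddings plus a vector index, and send the prompt to an actual model; the control flow stays the same.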
An automatic evaluator for instruction-following language models. Human-validated, high-quality, cheap, and fast.
Translation of Awesome-pytorch-list in progress……
A Toolbox for Adversarial Robustness Research
Early stopping for PyTorch
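The early-stopping pattern that repo packages is simple enough to sketch directly: track the best validation loss and stop once it has failed to improve for `patience` epochs. The class name and arguments below are illustrative, not the repo's actual API.

```python
class EarlyStopping:
    """Stop training when validation loss stops improving."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best_loss = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, val_loss):
        """Call once per epoch; returns True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss  # improvement: reset the counter
            self.counter = 0
        else:
            self.counter += 1          # no improvement this epoch
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop
```

In a training loop you would call `stopper.step(val_loss)` after each validation pass and break out when it returns `True` (typically also restoring the best checkpoint).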
Code for "Chameleon: Plug-and-Play Compositional Reasoning with Large Language Models".
Low latency JSON generation using LLMs ⚡️
Implementation of all RAG techniques in a simpler way.
A Structured Output Framework for LLM Outputs
Generates classical Chinese poems on demand, including Tang-style and acrostic poems, built with Keras and an LSTM-RNN. Fully documented.
A bot that generates realistic replies using a combination of pretrained GPT-2 and BERT models.
A summary of torchtext usage: implements the text-preprocessing pipeline step by step from scratch, including truncation and padding, vocabulary building, using pretrained word vectors, and building iterable datasets usable in PyTorch, then implements an LSTM with PyTorch.
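The truncate-and-pad step in that pipeline reduces to one small function, sketched here in plain Python; real torchtext code would use its Vocab and transform utilities, so the helper name below is illustrative only.

```python
def pad_or_truncate(token_ids, max_len, pad_id=0):
    """Cut a token-id sequence to max_len, or right-pad it with pad_id,
    so every example in a batch has the same length."""
    if len(token_ids) >= max_len:
        return token_ids[:max_len]
    return token_ids + [pad_id] * (max_len - len(token_ids))
```

Fixed-length sequences are what allow a batch to be stacked into a single tensor for the LSTM.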