Stars
Milvus is a high-performance, cloud-native vector database built for scalable vector ANN search
This project shares the technical principles behind large language models along with hands-on experience (LLM engineering and bringing LLM applications to production)
A hand-curated Chinese list of resources for prompt engineering, focused on GPT, ChatGPT, PaLM, and more (automatically and continuously updated)
A BNB Smart Chain client based on the go-ethereum fork
Create beautiful diagrams just by typing notation in plain text.
Visualization for a Retrieval-Augmented Generation (RAG) Assistant 🤖❤️📚
A reference containing Styles and Keywords that you can use with MidJourney AI. There are also pages showing resolution comparison, image weights, and much more!
🥣 Visual prompt editor for AIGC | OPS | Open Prompt Studio
Langchain-Chatchat (formerly Langchain-ChatGLM): local-knowledge-based RAG and Agent applications built with Langchain and language models such as ChatGLM, Qwen, and Llama
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Welcome to the Llama Cookbook! This is your go-to guide for building with Llama: getting started with inference, fine-tuning, and RAG. It also shows how to solve end-to-end problems using Llama models.
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
antimatter15 / alpaca.cpp
Forked from ggml-org/llama.cpp: locally run an instruction-tuned, chat-style LLM
Instruct-tune LLaMA on consumer hardware
OpenLMLab / OpenChineseLLaMA
Forked from meta-llama/llama: a Chinese large language model base generated through incremental pre-training on Chinese datasets
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed.
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
Chat with your documents on your local device using GPT models. No data leaves your device, and it is 100% private.
GPT4All: Run Local LLMs on Any Device. Open-source and available for commercial use.
Terminal UI library with rich, interactive widgets — written in Golang