rwkv: Implementation of the RWKV language model in pure WebGPU/Rust.
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it's combining the best of RN…
RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it combines the best of RNN and transformer - great performance, fast inference, …
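The dual formulation mentioned above - sequential RNN-style inference and parallelizable GPT-style training - can be illustrated with a toy decayed linear recurrence. This is a minimal numerical sketch, not RWKV's actual time-mixing kernel; the decay `w` and inputs are made up for illustration:

```python
import numpy as np

# Toy recurrence s_t = w * s_{t-1} + x_t, evaluated two ways.
# RNN mode: one step at a time, O(1) state (fast autoregressive inference).
# Parallel mode: all timesteps at once (GPT-style training). Same result.

def rnn_mode(x, w):
    s, states = 0.0, []
    for xt in x:
        s = w * s + xt          # sequential update
        states.append(s)
    return np.array(states)

def parallel_mode(x, w):
    T = len(x)
    # Unrolled form: s_t = sum_{i<=t} w^(t-i) * x_i,
    # computed as one masked matrix product over all timesteps.
    decay = w ** (np.arange(T)[:, None] - np.arange(T)[None, :])
    mask = np.tril(np.ones((T, T)))
    return (decay * mask) @ x

x = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(rnn_mode(x, 0.9), parallel_mode(x, 0.9)))  # True
```

Both modes compute identical states, which is why a model of this shape can train in parallel like a transformer yet run inference step by step like an RNN.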
The nanoGPT-style implementation of RWKV Language Model - an RNN with GPT-level LLM performance.
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
RWKV-LM-V7 (https://github.com/BlinkDL/RWKV-LM) under the Lightning framework.