💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
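A minimal sketch of how such a tokenizer library is typically used from Rust, assuming the `tokenizers` crate and a local `tokenizer.json` exported from a Hugging Face model; the file name and sample text are placeholders, not taken from the project itself.

```rust
// Minimal sketch: load a serialized tokenizer and encode a sentence.
// Assumes the `tokenizers` crate and a local `tokenizer.json` file
// (the file name and input text are placeholder assumptions).
use tokenizers::Tokenizer;

fn main() -> tokenizers::Result<()> {
    let tokenizer = Tokenizer::from_file("tokenizer.json")?;
    let encoding = tokenizer.encode("Fast tokenizers in Rust", false)?;
    println!("tokens: {:?}", encoding.get_tokens());
    println!("ids:    {:?}", encoding.get_ids());
    Ok(())
}
```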
⚡ Python-free Rust inference server — OpenAI-API compatible. GGUF + SafeTensors, hot model swap, auto-discovery, single binary. FREE now, FREE forever.
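As a rough illustration of what "OpenAI-API compatible" implies for clients, here is a hedged Rust sketch that posts a chat completion request with `reqwest`; the host, port, endpoint path, and model name follow the usual OpenAI convention and are assumptions, not details taken from the project.

```rust
// Hypothetical client call against an OpenAI-compatible chat completions
// endpoint. Host, port, path, and model name are placeholder assumptions.
use serde_json::json;

fn main() -> Result<(), reqwest::Error> {
    let body = json!({
        "model": "local-model.gguf",
        "messages": [{ "role": "user", "content": "Hello!" }]
    });
    let resp = reqwest::blocking::Client::new()
        .post("http://localhost:8080/v1/chat/completions")
        .json(&body)
        .send()?;
    println!("{}", resp.text()?);
    Ok(())
}
```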
Production-Ready LLM Agent SDK for Every Developer
AthenaOS is a next-generation, AI-native operating system managed by swarms of AI agents
An iTransformer implementation in Rust
Automatically generate Game Boy music using machine learning
An NLP-suite powered by deep learning
LLM inference in Rust
Fast BPE algorithm to generate byte pair encodings from a text corpus; written in Rust and approximately 20x faster than its Python implementation
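For context on what a BPE trainer does, below is a toy Rust sketch of a single merge step: count adjacent symbol pairs over a small corpus and merge the most frequent one. It is illustrative only and omits the batching and indexing tricks a fast implementation relies on; the sample words are placeholders.

```rust
// Minimal sketch of one BPE training step: count adjacent symbol pairs
// across a toy corpus and merge the most frequent pair.
use std::collections::HashMap;

fn most_frequent_pair(words: &[Vec<String>]) -> Option<(String, String)> {
    let mut counts: HashMap<(String, String), usize> = HashMap::new();
    for word in words {
        for pair in word.windows(2) {
            *counts.entry((pair[0].clone(), pair[1].clone())).or_insert(0) += 1;
        }
    }
    counts.into_iter().max_by_key(|(_, c)| *c).map(|(p, _)| p)
}

fn merge_pair(words: &mut [Vec<String>], pair: &(String, String)) {
    for word in words.iter_mut() {
        let mut merged = Vec::with_capacity(word.len());
        let mut i = 0;
        while i < word.len() {
            if i + 1 < word.len() && word[i] == pair.0 && word[i + 1] == pair.1 {
                merged.push(format!("{}{}", pair.0, pair.1));
                i += 2;
            } else {
                merged.push(word[i].clone());
                i += 1;
            }
        }
        *word = merged;
    }
}

fn main() {
    // Each word starts as a sequence of single-character symbols.
    let mut words: Vec<Vec<String>> = ["lower", "lowest", "low"]
        .iter()
        .map(|w| w.chars().map(|c| c.to_string()).collect())
        .collect();
    if let Some(pair) = most_frequent_pair(&words) {
        merge_pair(&mut words, &pair);
        println!("merged {:?} -> {:?}", pair, words);
    }
}
```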
⚙️ Build a modern Vim/Neovim experience in Rust, focusing on performance, Unicode support, and an open framework for collaboration and growth.
General tokenizer library for the Web and Node. Supports Huggingface and Tiktoken formats
AutoCursor is a privacy-focused, GPU-accelerated, context-aware AI autocorrect and autocomplete for Linux written in Rust and Python.
A simple automated k8s cluster built on AWS that deploys an actix_web reverse-proxy API to call inference on a google/pix2struct model trained for French road signs and hosted on Huggingface.co
Serve Phi3 with Candle and Actix 🦀
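A hedged sketch of the serving side only: an actix-web endpoint that would wrap a Candle-based generation call. The Candle model loading and decoding loop are stubbed with a placeholder `generate` function, and the route name and request shape are assumptions rather than the project's actual API.

```rust
// Serving-side sketch: an actix-web endpoint standing in for a Candle-backed
// text generation service. Model loading, tokenization, and the decoding
// loop are intentionally stubbed out.
use actix_web::{post, web, App, HttpServer, Responder};
use serde::Deserialize;

#[derive(Deserialize)]
struct Prompt {
    text: String,
}

// Stand-in for the real Candle inference call.
fn generate(prompt: &str) -> String {
    format!("echo: {prompt}")
}

#[post("/generate")]
async fn generate_handler(req: web::Json<Prompt>) -> impl Responder {
    generate(&req.text)
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().service(generate_handler))
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
}
```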