Experimental ROCm 6.1 setup for early Qwen7B model tests on AMD GPUs. 🚧
Updated Apr 24, 2025 · Shell
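For context, a minimal sketch of what inference with such a setup might look like, assuming a ROCm build of PyTorch (which exposes AMD GPUs through the CUDA API surface) and the `Qwen/Qwen-7B` checkpoint from the Hugging Face Hub; the model id and generation settings are illustrative, not taken from this repo:

```python
# Minimal sketch: Qwen-7B inference via Hugging Face transformers on a
# ROCm build of PyTorch. Model id and settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# ROCm-backed PyTorch reports AMD GPUs through the CUDA API surface.
device = "cuda" if torch.cuda.is_available() else "cpu"

tok = AutoTokenizer.from_pretrained("Qwen/Qwen-7B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-7B", torch_dtype=torch.float16, trust_remote_code=True
).to(device)

inputs = tok("Hello, world", return_tensors="pt").to(device)
out = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```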
Adaptive Token Sampling for Efficient Vision Transformers (ECCV 2022 Oral Presentation)
ROCm 6.2 Build for Qwen7B Inference – AMD RX 7900, Python 3.12, Safetensors, Multi-LLM Setup
Set up a transformers development environment using Docker
Compute a similarity score between two strings
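The repository's metric isn't specified; as a point of reference, a common baseline is the ratio from Python's stdlib `difflib`:

```python
# One common baseline for string similarity: difflib's ratio, a value in
# [0, 1] based on matching subsequences. The repo's actual metric may differ.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a, b).ratio()

print(similarity("transformer", "transformers"))  # ~0.96
```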
A collection of Jupyter Notebooks exploring key topics in Artificial Intelligence, including recommender systems, explainable AI, reinforcement learning, and transformers.
Gamei is an open-source application for generating presentations with Artificial Intelligence, running entirely locally on your device. The project gives you full control over your data and privacy while allowing the use of models such as OpenAI, Gemini, and others.
Scripts used to produce the RiboTIE paper
Master's Final Degree Project on Artificial Intelligence and Big Data
HydraNet is a state-of-the-art transformer architecture that combines Multi-Query Attention (MQA), Mixture of Experts (MoE), and continuous learning capabilities.
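HydraNet's code isn't reproduced here, but the Multi-Query Attention component its description names is easy to sketch: all query heads attend against a single shared key/value head, shrinking the KV cache. A minimal, illustrative PyTorch version (names and shapes are assumptions, not HydraNet's):

```python
# Illustrative Multi-Query Attention (MQA): many query heads share one K/V head.
import torch
import torch.nn as nn

class MultiQueryAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)           # one projection per query head
        self.kv_proj = nn.Linear(d_model, 2 * self.d_head)  # a single shared K/V head
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k, v = self.kv_proj(x).split(self.d_head, dim=-1)
        k = k.unsqueeze(1)  # broadcast the single K/V head across all query heads
        v = v.unsqueeze(1)
        att = ((q @ k.transpose(-2, -1)) / self.d_head**0.5).softmax(dim=-1)
        y = (att @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(y)

x = torch.randn(2, 16, 256)
print(MultiQueryAttention(256, 8)(x).shape)  # torch.Size([2, 16, 256])
```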
Reference PyTorch code for Hugging Face transformers
Japanese NLP sample code
RWKV Wiki website (archived; please visit the official wiki)
PyTorch implementation of the paper "Audio Mamba: Bidirectional State Space Model for Audio Representation Learning"
Deploy KoGPT with Triton Inference Server
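A hedged sketch of what a client call to a Triton-served KoGPT model might look like, using the official `tritonclient` package; the model name (`kogpt`) and tensor names (`input_text`, `output_text`) are placeholders, since the real deployment defines its I/O signature in its `config.pbtxt`:

```python
# Hedged sketch of an HTTP client call to a KoGPT model on Triton Inference
# Server. Model and tensor names are placeholders, not from this repo.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")
assert client.is_server_ready()

# Triton passes variable-length text as BYTES tensors of object dtype.
prompt = np.array(["인공지능은".encode("utf-8")], dtype=np.object_)
infer_input = httpclient.InferInput("input_text", [1], "BYTES")
infer_input.set_data_from_numpy(prompt)

result = client.infer(model_name="kogpt", inputs=[infer_input])
print(result.as_numpy("output_text"))
```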
Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023]
A radically simple, reliable, high-performance template for quickly getting set up to build multi-agent applications