Reference PyTorch code for Hugging Face Transformers
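A minimal sketch of what such reference usage typically looks like with the Hugging Face `transformers` library; the `gpt2` checkpoint is an illustrative choice, not necessarily what this repository covers:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load an example checkpoint; any causal LM hosted on the Hub works the same way.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```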
Set up a transformers development environment using Docker
Master's Final Degree Project on Artificial Intelligence and Big Data
Adaptive Token Sampling for Efficient Vision Transformers (ECCV 2022 Oral Presentation)
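As a rough illustration of the idea behind adaptive token sampling, the sketch below keeps only the patch tokens the CLS token attends to most. This is a simplified top-k pruning, not the paper's inverse-transform sampling, and the function name and tensor shapes are assumptions:

```python
import torch

def prune_tokens(tokens: torch.Tensor, cls_attn: torch.Tensor, keep: int) -> torch.Tensor:
    """Keep the `keep` patch tokens that the CLS token attends to most.

    tokens:   (batch, 1 + n_patches, dim), with the CLS token first.
    cls_attn: (batch, n_patches) attention weights from CLS to each patch.
    """
    idx = cls_attn.topk(keep, dim=-1).indices                    # (batch, keep)
    idx = idx.unsqueeze(-1).expand(-1, -1, tokens.size(-1)) + 1  # +1 skips CLS at index 0
    kept = torch.gather(tokens, 1, idx)
    return torch.cat([tokens[:, :1], kept], dim=1)               # re-attach CLS
```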
Deploy KoGPT with Triton Inference Server
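A minimal client-side sketch of querying such a deployment with Triton's Python HTTP client; the model name (`kogpt`) and tensor names (`input_ids`, `logits`) are hypothetical and would be defined by the actual model configuration:

```python
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Hypothetical tokenized prompt; a real client would run the KoGPT tokenizer first.
input_ids = np.array([[1, 2, 3]], dtype=np.int64)
infer_input = httpclient.InferInput("input_ids", input_ids.shape, "INT64")
infer_input.set_data_from_numpy(input_ids)

result = client.infer(model_name="kogpt", inputs=[infer_input])
logits = result.as_numpy("logits")  # output tensor name depends on the model config
```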
Japanese NLP sample code
RWKV Wiki website (archived; please visit the official wiki)
Code for "Can We Scale Transformers to Predict Parameters of Diverse ImageNet Models?" [ICML 2023]
Scripts run to produce the results in the RiboTIE paper
A collection of Jupyter Notebooks exploring key topics in Artificial Intelligence, including recommender systems, explainable AI, reinforcement learning, and transformers.
Compute a similarity score between two strings
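One common way to implement such a score is normalized Levenshtein distance; a self-contained sketch under that assumption (the repo may well use a different metric):

```python
def similarity(a: str, b: str) -> float:
    """Return a similarity score in [0, 1] based on Levenshtein edit distance."""
    if not a and not b:
        return 1.0
    # Dynamic-programming edit distance, keeping only the previous row.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return 1.0 - prev[-1] / max(len(a), len(b))

print(similarity("kitten", "sitting"))  # 3 edits over length 7 -> ~0.571
```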
Experimental ROCm 6.1 setup for early Qwen7B model tests on AMD GPUs. 🚧
ROCm 6.2 build for Qwen7B inference – AMD RX 7900, Python 3.12, Safetensors, multi-LLM setup
Gamei is an open-source application for generating presentations with Artificial Intelligence, running entirely locally on your device. The project gives you full control over your data and privacy while allowing the use of models such as OpenAI, Gemini, and others.
Implementation of the paper: "Audio Mamba: Bidirectional State Space Model for Audio Representation Learning" in pytorch
A radically simple, reliable, high-performance template for quickly getting set up to build multi-agent applications
HydraNet is a state-of-the-art transformer architecture that combines Multi-Query Attention (MQA), Mixture of Experts (MoE), and continuous learning capabilities.
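A minimal PyTorch sketch of the first listed component, multi-query attention: all query heads share a single key/value head, which shrinks the KV cache. Class and parameter names here are illustrative, not HydraNet's actual API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiQueryAttention(nn.Module):
    """Multi-query attention: n_heads query heads share one key/value head."""

    def __init__(self, dim: int, n_heads: int):
        super().__init__()
        assert dim % n_heads == 0
        self.n_heads = n_heads
        self.head_dim = dim // n_heads
        self.q_proj = nn.Linear(dim, dim)
        # Keys and values are projected to a single head's width and shared.
        self.k_proj = nn.Linear(dim, self.head_dim)
        self.v_proj = nn.Linear(dim, self.head_dim)
        self.out_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.head_dim).transpose(1, 2)
        # Expand the single K/V head across all query heads (a view, no copy).
        k = self.k_proj(x).unsqueeze(1).expand(b, self.n_heads, t, self.head_dim)
        v = self.v_proj(x).unsqueeze(1).expand(b, self.n_heads, t, self.head_dim)
        out = F.scaled_dot_product_attention(q, k, v)
        return self.out_proj(out.transpose(1, 2).reshape(b, t, d))
```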