minimal workflow engine for data processing (POC)
Arduino coil and transformer tester and ring detector
Optimization of attention layers for efficient inference on the CPU and GPU. It covers optimizations for AVX and CUDA, as well as efficient memory-processing techniques.
Code repository for the research paper "Space Efficient Transformer Neural Network"
Transformer- and optimisation-based vision positioning system
NanoOWL Detection System enables real-time open-vocabulary object detection in ROS 2 using a TensorRT-optimized OWL-ViT model. Describe objects in natural language and detect them instantly on panoramic images. Optimized for NVIDIA GPUs with .engine acceleration.
Experimental implementation of a Transformer (with Smolgen) on NVIDIA GPUs using LibTorch
Method for searching relevant podcast segments from transcripts using transformer models
Artificial Intelligence (AI) chip with its own hardware design and software. TruthGPT source tree
Lightweight, header-only Byte Pair Encoding (BPE) trainer in modern C++17. Produces HuggingFace-compatible vocabularies for transformers and integrates with Modern Text Tokenizer.
Poplar implementation of FlashAttention for IPU
Transformers games by High Moon Studios. The Transformers ReEnergized Server Project aims to reverse engineer the Transformers multiplayer servers that were shut down by Demonware Services and revive them for fans.
Whisper in TensorRT-LLM
Aussie AI Base C++ Library is the source code repo for the book Generative AI in C++, along with various other AI/ML kernels.
Tokenizers and Machine Learning Models for biological sequence data
This is the official implementation for our paper "LAR: Look Around and Refer".
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
CPU inference for the DeepSeek family of large language models in C++
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference