minimal workflow engine for data processing (POC)
Method for searching relevant podcast segments from transcripts using transformer models
Artificial Intelligence (AI) chip with its own hardware design and software. TruthGPT source tree.
Optimization of attention layers for efficient inference on the CPU and GPU. It covers optimizations for AVX and CUDA as well as efficient memory-handling techniques.
NanoOWL Detection System enables real-time open-vocabulary object detection in ROS 2 using a TensorRT-optimized OWL-ViT model. Describe objects in natural language and detect them instantly on panoramic images. Optimized for NVIDIA GPUs with .engine acceleration.
Transformers games by High Moon Studios. The Transformers ReEnergized Server Project aims to reverse engineer the Transformers multiplayer servers that were shut down by Demonware and revive them for fans.
Transformer- and optimisation-based vision positioning system
Arduino coil and transformer tester and ring detector
Lightweight, header-only Byte Pair Encoding (BPE) trainer in modern C++17. Produces HuggingFace-compatible vocabularies for transformers and integrates with Modern Text Tokenizer.
Code repository for the research paper "Space Efficient Transformer Neural Network"
Poplar implementation of FlashAttention for IPU
Experimental implementation of a Transformer (with Smolgen) on NVIDIA GPUs using LibTorch
This is the official implementation for our paper, "LAR: Look Around and Refer".
Whisper in TensorRT-LLM
Aussie AI Base C++ Library is the source code repo for the book Generative AI in C++, along with various other AI/ML kernels.
Tokenizers and Machine Learning Models for biological sequence data
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
CPU inference for the DeepSeek family of large language models in C++
Open source real-time translation app for Android that runs locally