Attention-based video classifier running on accelerated attention approximations
Creating a transformer (encoder/decoder) from scratch. Also experimented with ALiBi encodings as an alternative to positional encodings.
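The ALiBi idea mentioned above replaces positional encodings with a distance-proportional penalty added to the attention logits. A minimal sketch of that bias (not this repository's actual code; function name and NumPy usage are illustrative assumptions):

```python
import numpy as np

def alibi_bias(n_heads: int, seq_len: int) -> np.ndarray:
    """Additive attention bias of shape (heads, seq, seq), used in place
    of positional encodings (Press et al., ALiBi)."""
    # Head-specific slopes: geometric series 2^(-8/n), 2^(-16/n), ...
    slopes = 2.0 ** (-8.0 * np.arange(1, n_heads + 1) / n_heads)
    # Query-key distance i - j; penalize only past (causal) positions
    dist = np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :]
    bias = -slopes[:, None, None] * np.maximum(dist, 0)
    return bias  # added to attention logits before the softmax
```

Because the bias depends only on relative distance, it also extrapolates to sequence lengths longer than those seen in training.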
This repository uses modules from Swin Transformer to build Transformer-based Generative Adversarial Network (GAN) models.
Deep Learning Methods for Environmental Audio Classification
Transformer implementation in TensorFlow
Code for "The Locality and Symmetry of Positional Encodings" (EMNLP Findings)
A step-by-step implementation of a GPT-style language model on a combined dataset of the Harry Potter novels, inspired by Andrej Karpathy's lecture.
Codebase for paper accepted in ICIP 2025
This repository contains my learning notes and experiments in AI and generative AI.
A Transformer-based neural network that decodes movement intentions in real time from EEG, EMG, and IMU signals. It classifies intended actions and sends control signals to external actuators, such as robotic arms or electrical muscle stimulation systems.
Simple character-level Transformer
A small language model trained on famous literary works in Japanese.
Transformer implemented from scratch with GQA, RoPE, and RMSNorm, plus the training code.
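Of the components listed above, RMSNorm is the simplest to illustrate: unlike LayerNorm it skips mean subtraction and normalizes only by the root mean square. A minimal sketch (not this repository's code; the function name and NumPy formulation are assumptions):

```python
import numpy as np

def rms_norm(x: np.ndarray, gain: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """RMSNorm over the last axis: rescale by root-mean-square, then apply
    a learned per-feature gain. No mean subtraction, unlike LayerNorm."""
    rms = np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    return x / rms * gain
```

Dropping the mean-centering step saves computation and, empirically, matches LayerNorm quality in transformer blocks, which is why it appears in models such as LLaMA.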
Physics-Informed Meta Graph Transformer for Travel State Estimation via Traffic Density
Repository for Yowlumne data and scripts for WIELD
Deep Classiflie is a framework for developing ML models that bolster fact-checking efficiency. As a POC, the initial alpha release of Deep Classiflie generates/analyzes a model that continuously classifies a single individual's statements (Donald Trump) using a single ground truth labeling source (The Washington Post). For statements the model d…
Repository for a transformer I coded from scratch and trained on the tiny-shakespeare dataset.
A sandbox for fast image classification experiments. Define a custom network architecture using provided building blocks in seconds, then run an experiment!
Neural-Symbolic-Superintelligence scaling