🔍 Visualize attention patterns in transformer models to better understand how LLMs process text inputs with interactive heatmaps and comparisons.
💥 Optimize linear attention models with efficient Triton-based implementations in PyTorch, compatible across NVIDIA, AMD, and Intel platforms.
Explore Attention-based Deep Learning for brain tumor image analysis. Enhance diagnosis accuracy and efficiency. 🌟🖥️
A set of scripts to generate full attention-head heatmaps for transformer-based LLMs
Vulkan & GLSL implementation of FlashAttention-2
FlashInfer: Kernel Library for LLM Serving
🐙 Implements Flash Attention with an attention sink for gpt-oss-20b; includes test.py. Work in progress: backward pass, varlen support, and syncing with the community on returning only softmax_lse.
[ICML2025] SpargeAttention: A training-free sparse attention that accelerates any model inference.
Public dataset from the Human Clarity Institute (HCI)’s 2025 Digital Life Survey, exploring digital behaviour, fatigue, focus, trust, and values alignment across six English-speaking countries
Public dataset from the Human Clarity Institute (HCI)’s 2025 Focus & Distraction Survey, examining attention, productivity, and digital behaviour patterns.
Implementation of Danijar's latest iteration of his Dreamer line of work
A Pomodoro countdown timer that helps you focus by blocking access to distracting websites.
A code deep-dive on one of the key innovations from Deepseek - Multihead Latent Attention (MLA)
Train a GPT from scratch on your laptop
A complete implementation of the "Attention Is All You Need" Transformer model from scratch using PyTorch. This project focuses on building and training a Transformer for neural machine translation (English-to-Italian) on the OpusBooks dataset.
Mostly fighting attention-stealing with ViolentMonkey UserScripts here ✊
A compilation of the best multi-agent papers
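
Several of the entries above visualize attention patterns as heatmaps. A minimal sketch of that idea, assuming a Hugging Face transformers model loaded with output_attentions enabled; the model name, layer, and head indices are illustrative and not taken from any repository listed here:

```python
# Minimal sketch: extract one attention head's weights and plot a heatmap.
# Assumes the Hugging Face transformers and matplotlib packages are installed.
import torch
import matplotlib.pyplot as plt
from transformers import AutoModel, AutoTokenizer

model_name = "distilbert-base-uncased"  # illustrative choice; any encoder model works similarly
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name, output_attentions=True)
model.eval()

text = "Attention visualizations help explain how LLMs process text."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple of (batch, heads, seq, seq) tensors, one per layer
layer, head = 2, 0  # illustrative indices
attn = outputs.attentions[layer][0, head]  # (seq, seq) attention weights
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])

plt.imshow(attn.numpy(), cmap="viridis")
plt.xticks(range(len(tokens)), tokens, rotation=90)
plt.yticks(range(len(tokens)), tokens)
plt.colorbar(label="attention weight")
plt.title(f"Layer {layer}, head {head}")
plt.tight_layout()
plt.show()
```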
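
For the from-scratch "Attention Is All You Need" implementations listed above, the core building block is scaled dot-product attention. A minimal sketch with illustrative tensor shapes, not code from any listed repository:

```python
# Minimal sketch of scaled dot-product attention; shapes and the optional mask are illustrative.
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq, d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # the attention pattern that heatmaps visualize
    return weights @ v, weights

q = k = v = torch.randn(1, 8, 16, 64)        # batch=1, 8 heads, 16 tokens, d_k=64
out, weights = scaled_dot_product_attention(q, k, v)
print(out.shape, weights.shape)              # (1, 8, 16, 64) and (1, 8, 16, 16)
```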