A Julia package providing modular and extensible attention mechanisms for deep learning models.
Updated Aug 18, 2025 - Julia
Julia implementation of Transformer models
Model implementation for "Adaptive computation as a new mechanism of dynamic human attention"
Implementation of Single Headed Attention recurrent neural networks in Julia and Knet
Memory, Attention and Composition (MAC) Network for CLEVR implemented via KnetLayers
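The repositories above all build on the same core operation. As a minimal sketch of what an attention mechanism computes (not taken from any of the listed packages; the function names `softmax_cols` and `attention` are illustrative, and real Julia code would typically use `NNlib.softmax`), scaled dot-product attention can be written as:

```julia
using LinearAlgebra

# Column-wise softmax (hypothetical helper; packages usually use NNlib.softmax).
function softmax_cols(x)
    e = exp.(x .- maximum(x; dims=1))  # subtract max for numerical stability
    e ./ sum(e; dims=1)
end

# Scaled dot-product attention.
# Q is (d, n_q), K and V are (d, n_k): d features per position.
function attention(Q, K, V)
    d = size(Q, 1)
    scores = (K' * Q) ./ sqrt(d)   # (n_k, n_q) similarity scores
    V * softmax_cols(scores)       # each output column is a weighted sum of value columns
end

Q = rand(4, 3); K = rand(4, 5); V = rand(4, 5)
out = attention(Q, K, V)           # size (4, 3): one output per query
```

Multi-head and single-headed variants (such as the SHA-RNN port listed above) differ mainly in how Q, K, and V are produced and split, not in this core computation.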