A PyTorch implementation of the Gated-Attention (GA) Reader model (Python; updated Aug 20, 2018)
Text summarization using LSTM_Attention, TextRank, PyTextRank, LexRank, Gensim, and PyTeaser
WGAN with feedback from the discriminator, and LayerNorm instead of BatchNorm
PyTorch implementation of "Image Inpainting with Learnable Bidirectional Attention Maps (ICCV 2019)"
A project developed by Graham Henderson as part of a TU Dublin B.Sc. (Hons) degree in Computing in Information Technology.
Code for the paper "Constructing Global Coherence Representations: Identifying Interpretability and Coherences of Transformer Attention in Time Series Data". It creates coherence matrices that represent the attention from each symbol to every other symbol.
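As a rough illustration (not the paper's code), a symbol-to-symbol attention matrix of this kind can be sketched as a row-normalized similarity matrix over symbol embeddings; the function and dimensions below are hypothetical:

```python
import numpy as np

def attention_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Sketch: pairwise attention weights between symbols.

    Row i holds softmax-normalized scaled dot-product similarities of
    symbol i to every symbol, i.e. a symbol-to-symbol coherence matrix.
    """
    # Scaled dot-product scores, as in standard attention
    scores = embeddings @ embeddings.T / np.sqrt(embeddings.shape[1])
    # Subtract row max for numerical stability before exponentiating
    scores -= scores.max(axis=1, keepdims=True)
    weights = np.exp(scores)
    # Normalize each row so attention from a symbol sums to 1
    return weights / weights.sum(axis=1, keepdims=True)

# Toy example: 4 symbols with 8-dimensional embeddings
rng = np.random.default_rng(0)
A = attention_matrix(rng.standard_normal((4, 8)))
```

Each row of `A` is a probability distribution, so row sums are 1 and entry `A[i, j]` can be read as how strongly symbol `i` attends to symbol `j`.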
Official code for "Towards Top-Down Stereoscopic Image Quality Assessment via Stereo Attention"
Image morphing between prompts using Stable Diffusion
TensorFlow implementation of LIC-TCM (Learned Image Compression with Mixed Transformer-CNN Architectures, CVPR 2023 Highlight)
Implementation of Multi-head Latent Attention (MLA) from DeepSeek-V2
Attention Augmented Convolutional Networks, using ResNet, CORnet, Inception and EfficientNet as backbone architectures
This repository contains the code for the submission to SemEval 2022 Task 5: MAMI.
Fake News Detection using Hierarchical Convolutional Attention Network
This repository contains the code to reproduce the experiments from the article "Dynamical Mean-Field Theory of Self-Attention Neural Networks".
Train a GPT from scratch on your laptop
Exploring Human-like Attention Supervision in Visual Question Answering
Implementation of the Transformer from the paper "Attention Is All You Need" in Keras