Code for selecting an action based on multimodal inputs; in this case the inputs are voice and text.
Unofficial implementation of "Prompt-to-Prompt Image Editing with Cross Attention Control" with Stable Diffusion
A lightweight PyTorch implementation of the Transformer-XL architecture proposed by Dai et al. (2019)
[ITSC-2023] HRFuser: A Multi-resolution Sensor Fusion Architecture for 2D Object Detection
Segment-Like-Me: 1-shot image segmentation using Stable Diffusion
[NeurIPS 2023] Official implementation of the paper "CAST: Cross-Attention in Space and Time for Video Action Recognition"
1-shot image segmentation using Stable Diffusion
This model synthesises high-fidelity fashion videos from single images featuring spontaneous and believable movements.
Clickbait detection using a custom cross-attention Transformer model
The official repository of "Energy-Based Cross Attention for Bayesian Context Update in Text-to-Image Diffusion Models".
Code implementation of Transformer Model in "Attention is All You Need" in PyTorch.
Raw C/CUDA implementation of a 3D GAN
Transcription factor binding site prediction for novel DNA sequence data aiding in mutation identification and drug discovery
TGRS: Code for "Unsupervised Hybrid Network of Transformer and CNN for Blind Hyperspectral and RGB Image Fusion"
Implementation of CALM from the paper "LLM Augmented LLMs: Expanding Capabilities through Composition", out of Google DeepMind
[ISMB 2024] Official PyTorch Code for "PhiHER2: Phenotype-informed weakly supervised model for HER2 status prediction from WSIs"
PyTorch implementation of the CL-ViT and FF-ViT models
IEEE ICME : "Cross-Attention is not always needed: Dynamic Cross-Attention for Audio-Visual Dimensional Emotion Recognition"
Detect deepfaked faces using multiple deep learning models
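The repositories above all build on the same core operation: cross-attention, where queries from one sequence (or modality) attend over keys and values from another. As a minimal NumPy sketch for orientation only (function and variable names are illustrative and not taken from any listed repository):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, context, Wq, Wk, Wv):
    """Single-head cross-attention.

    queries: (n_q, d_model)  e.g. text-token features
    context: (n_kv, d_model) e.g. audio or image features
    Wq, Wk:  (d_model, d_k)  projection matrices
    Wv:      (d_model, d_v)
    """
    Q = queries @ Wq                             # (n_q, d_k)
    K = context @ Wk                             # (n_kv, d_k)
    V = context @ Wv                             # (n_kv, d_v)
    scores = Q @ K.T / np.sqrt(Q.shape[-1])      # scaled dot-product, (n_q, n_kv)
    weights = softmax(scores, axis=-1)           # each query's weights sum to 1
    return weights @ V                           # (n_q, d_v)
```

In self-attention, `queries` and `context` are the same sequence; cross-attention differs only in that they come from two different sources, which is what lets the projects above fuse modalities (text/audio, image/text, sensor streams).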