Lauren's personal website
A project developed by Graham Henderson as part of a TU Dublin B.Sc. (Hons) degree in Computing in Information Technology.
Implementation of Multi-head Latent Attention (MLA) from DeepSeek-V2; see the second sketch at the end of this list.
NER for Chinese electronic medical records, using doc2vec, self_attention, and multi_attention.
Simple from-scratch implementations of transformer-based models that match the state of the art.
Resources and references on solved and unsolved problems in attention mechanisms.
N4: official PyTorch implementation of the Nova Nox Neural Network, combining a simplified selective-copying mechanism with an attention mechanism.
This website provides information about a reproducible Docker-based environment for training and integrating infant behavior models using the CST (Cognitive System Toolkit). It was developed as part of a tutorial presented at the ICDL (International Conference on Development and Learning).
Public dataset from the Human Clarity Institute (HCI)’s 2025 Focus & Distraction Survey, examining attention, productivity, and digital behaviour patterns.
Public dataset from the Human Clarity Institute (HCI)’s 2025 Digital Life Survey, exploring digital behaviour, fatigue, focus, trust, and values alignment across six English-speaking countries.
Published in Frontiers in Psychology - Cognition (https://www.frontiersin.org/articles/10.3389/fpsyg.2019.02833/full).
Neural Machine Translation with Attention - Keras and TensorFlow 1.x
PyTorch implementation of transformers with multi-head self-attention; see the first sketch at the end of this list.
This repository contains a list of papers, open-source code, and datasets in the field of document-level NMT.
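
Several entries above, such as the from-scratch transformer implementations, center on multi-head self-attention. Below is a minimal, hypothetical PyTorch sketch of that mechanism for orientation; it is not code from any repository listed here, and the class name, dimensions, and head count are all made up.

import math
import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    # Hypothetical illustration of multi-head self-attention, not any listed repo's code.
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0, "d_model must be divisible by n_heads"
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # One projection produces all heads' queries, keys, and values at once.
        self.qkv_proj = nn.Linear(d_model, 3 * d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape  # (batch, seq_len, d_model)
        q, k, v = self.qkv_proj(x).chunk(3, dim=-1)
        # Split the model dimension into heads: (batch, n_heads, seq_len, d_head).
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        # Scaled dot-product attention within each head.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)
        ctx = scores.softmax(dim=-1) @ v  # (batch, n_heads, seq_len, d_head)
        # Merge the heads back together and apply the output projection.
        ctx = ctx.transpose(1, 2).contiguous().view(b, t, d)
        return self.out_proj(ctx)

x = torch.randn(2, 16, 64)  # made-up batch: 2 sequences, length 16, width 64
attn = MultiHeadSelfAttention(d_model=64, n_heads=8)
print(attn(x).shape)  # torch.Size([2, 16, 64])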
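
The MLA entry above refers to DeepSeek-V2's Multi-head Latent Attention, whose central idea is to down-project keys and values into one small shared latent vector, so a decoder only has to cache that latent rather than full per-head keys and values. The sketch below is a simplified, hypothetical illustration of that compression step: it omits the decoupled rotary-position branch and the query compression of the full method, and all names and dimensions are assumptions.

import math
import torch
import torch.nn as nn

class SimplifiedLatentAttention(nn.Module):
    # Hypothetical, simplified sketch of MLA's low-rank key-value compression;
    # not DeepSeek-V2's actual code, and the rotary-position branch is omitted.
    def __init__(self, d_model: int, n_heads: int, d_latent: int):
        super().__init__()
        assert d_model % n_heads == 0, "d_model must be divisible by n_heads"
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)
        # Down-projection to a small latent: the only KV state a cache would hold.
        self.kv_down = nn.Linear(d_model, d_latent)
        # Up-projections reconstruct per-head keys and values from the latent.
        self.k_up = nn.Linear(d_latent, d_model)
        self.v_up = nn.Linear(d_latent, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, d = x.shape  # (batch, seq_len, d_model)
        c_kv = self.kv_down(x)  # (batch, seq_len, d_latent), far smaller than full K/V
        q, k, v = self.q_proj(x), self.k_up(c_kv), self.v_up(c_kv)
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))
        # Standard scaled dot-product attention over the reconstructed keys/values.
        att = (q @ k.transpose(-2, -1) / math.sqrt(self.d_head)).softmax(dim=-1)
        y = (att @ v).transpose(1, 2).reshape(b, t, d)
        return self.out_proj(y)

x = torch.randn(2, 16, 64)  # made-up sizes
mla = SimplifiedLatentAttention(d_model=64, n_heads=8, d_latent=16)
print(mla(x).shape)  # torch.Size([2, 16, 64])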