A collection of TikZ diagrams for neural network and deep learning concepts. NNTikZ is designed to provide clean and consistent figures suitable for academic papers, lecture notes, theses, and presentations. All diagrams are open source, easy to customize, and written entirely in TikZ/LaTeX. Feel free to open an issue to suggest a diagram or submit a pull request with any contributions.
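The diagrams can be dropped into any document with a standard TikZ setup. As a minimal usage sketch (assuming each diagram ships as its own `.tex` file containing a bare `tikzpicture` environment, e.g. a hypothetical `transformer.tex`; the actual file layout and any required TikZ libraries may differ):

```latex
% Minimal usage sketch (not taken from the repository itself):
% assumes a hypothetical file transformer.tex containing a bare
% tikzpicture environment and no extra TikZ libraries.
\documentclass{article}
\usepackage{tikz}

\begin{document}

\begin{figure}[t]
  \centering
  \input{transformer} % draw the TikZ diagram inline
  \caption{Transformer architecture.}
\end{figure}

\end{document}
```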
Some example diagrams are shown below:
- Transformer
- Multi-Head Attention
- Neural Network
- Attention Mechanism
- Gated Recurrent Unit (GRU)
- RNN Encoder–Decoder (Sutskever et al.)
- Backpropagation Through Time (BPTT)
If you use NNTikZ in your research or project, please cite it as:
```bibtex
@misc{nntikz,
  author    = {Fraser Love},
  title     = {NNTikZ: TikZ Diagrams for Deep Learning and Neural Networks},
  year      = {2024},
  publisher = {GitHub},
  url       = {https://github.com/fraserlove/nntikz}
}
```