    Repositories list

    • docs

      Public
      (deprecated) Source for Midjourney's official wiki
      CSS
      Updated Nov 24, 2025
    • SageAttention

      Public
      Quantized Attention achieves speedups of 2-5x and 3-11x compared to FlashAttention and xformers, without losing end-to-end metrics across language, image, and video models.
      Cuda
      Updated Jul 8, 2025
    • xDiT

      Public
      xDiT: A Scalable Inference Engine for Diffusion Transformers (DiTs) with Massive Parallelism
      Python
      Updated Jun 18, 2025
    • long-context-attention

      Public
      USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long Context Transformers Model Training and Inference
      Python
      Updated Jun 5, 2025
    • flash-attention-jax

      Public
      Implementation of Flash Attention in Jax
      Python
      Updated Jul 17, 2024
    • nanobind

      Public
      nanobind: tiny and efficient C++/Python bindings
      C++
      Updated Mar 15, 2024
    • hf-transformers

      Public
      🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
      Python
      Updated Feb 14, 2024
    • transformers

      Public
      🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
      Python
      Updated Sep 1, 2023
    • jaxtorch

      Public
      Python
      Updated Jun 7, 2023
    • equinox

      Public archive
      Elegant easy-to-use neural networks in JAX. https://docs.kidger.site/equinox/
      Python
      Updated May 1, 2023
    • einops

      Public
      Deep learning operations reinvented (for pytorch, tensorflow, jax and others)
      Python
      Updated Dec 29, 2022
    • flash-attention

      Public
      Fast and memory-efficient exact attention
      Python
      Updated Dec 17, 2022
    • jax

      Public archive
      Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
      Python
      Updated Oct 12, 2022
    • flax

      Public archive
      Flax is a neural network library for JAX that is designed for flexibility.
      Python
      Updated Sep 17, 2022