Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applications in Python ⚡
Updated Nov 19, 2025 - Python
ALBERT model pretraining and fine-tuning using TF 2.0
Simple and efficient RevNet-Library for PyTorch with XLA and DeepSpeed support and parameter offload
S + Autograd + XLA :: S-parameter based frequency domain circuit simulations and optimizations using JAX.
Pretrained-model loading built on TensorFlow 1.x, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision. Flexible training, validation, and prediction.
PyTorch distributed training acceleration framework
TensorFlow 2 training code with JIT compilation on multi-GPU.
Fast and easy distributed model training examples.
katmer is a powerful library for optimizing the design of optical thin films using automatic differentiation via JAX and Equinox, enabling efficient and accurate inverse design solutions.
Provides code to serialize the different models involved in Stable Diffusion as SavedModels and to compile them with XLA.
Easy to use and blazing fast JAX-based library for high-performance 2D/3D Discrete Element Method (DEM) simulations.
Versatile Data Ingestion Pipelines for Jax
Classification of multilingual dataset trained only on English training data using pre-trained models. Model is trained on TPUs using PyTorch and torch_xla library.
deep learning inference perf analysis
As the quality of large language models improves, so do our expectations of what they can do. Since the release of OpenAI's GPT-2, text generation capabilities have received growing attention, and for good reason: these models can be used for summarization, translation, and even real-time learning on some language tasks.
Dataloading for JAX
Modern Graph TensorFlow implementation of Super-Resolution GAN
A Multivariate Gaussian Bayes classifier written using JAX
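Several of the repositories above build classic ML components directly on JAX's NumPy-style API. As a rough illustration of the multivariate Gaussian Bayes classifier idea (not the linked repository's actual code; all function names here are hypothetical), the sketch below fits per-class priors, means, and covariances, then classifies by maximum log posterior. Plain `numpy` is used so it runs anywhere; since `jax.numpy` mirrors the NumPy API, the same logic could be expressed in JAX and `jit`-compiled via XLA:

```python
import numpy as np

def fit_gaussian_bayes(X, y):
    """Estimate per-class prior, mean, and (regularized) covariance."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (
            len(Xc) / len(X),                 # prior P(y = c)
            Xc.mean(axis=0),                  # class mean vector
            np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1]),  # covariance + jitter
        )
    return params

def log_gaussian(x, mean, cov):
    """Log-density of a multivariate normal evaluated at x."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + len(x) * np.log(2 * np.pi))

def predict(params, x):
    """Pick the class maximizing log prior + log likelihood."""
    scores = {c: np.log(p) + log_gaussian(x, m, s) for c, (p, m, s) in params.items()}
    return max(scores, key=scores.get)
```

Because every step is plain array arithmetic and a linear solve, porting it to `jax.numpy` is mostly an import swap, which is what makes JAX attractive for small statistical models like this.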