# onnx-runtime

Here are 23 public repositories matching this topic...

A curated list of awesome inference and deployment frameworks for artificial intelligence (AI) models: OpenVINO, TensorRT, MediaPipe, TensorFlow Lite, TensorFlow Serving, ONNX Runtime, LibTorch, NCNN, TNN, MNN, TVM, MACE, Paddle Lite, MegEngine Lite, OpenPPL, Bolt, ExecuTorch.

  • Updated May 3, 2024
  • Python

Babylon.cpp is a C and C++ library for grapheme-to-phoneme conversion and text-to-speech synthesis. For phonemization, it uses an ONNX Runtime port of the DeepPhonemizer model; for speech synthesis, it uses VITS models. Piper models are compatible after running a conversion script.

  • Updated Aug 31, 2025
  • Python
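As a rough illustration of the kind of pipeline Babylon.cpp wraps, the sketch below runs a phonemizer-style ONNX model through the onnxruntime Python API. The model file, input/output names, and token IDs are hypothetical placeholders for illustration only; they are not Babylon.cpp's actual interface, which is a C/C++ API.

```python
# Minimal sketch, assuming a hypothetical phonemizer ONNX model that takes a
# batch of token IDs; the file name, tensor names, and IDs are illustrative only.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("phonemizer.onnx", providers=["CPUExecutionProvider"])

# Read the model's declared input/output names instead of hard-coding them.
input_name = session.get_inputs()[0].name
output_name = session.get_outputs()[0].name

# Dummy token IDs for the word being phonemized (a real G2P model needs its own tokenizer).
token_ids = np.array([[12, 7, 33, 2]], dtype=np.int64)

phoneme_logits = session.run([output_name], {input_name: token_ids})[0]
print(phoneme_logits.shape)
```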

FaceTron is a high-performance face embedding server using ONNX Runtime, supporting dynamic multi-model loading, offline deployment, and scalable environments. It exposes an OpenAPI endpoint with MCP-compatible metadata and integrates with OpenTelemetry for observability.

  • Updated May 11, 2025
  • Python
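For context, the core of an ONNX Runtime face-embedding service like the one FaceTron describes looks roughly like the sketch below: load an embedding model once, then run preprocessed face crops through it. The model file, input size (112x112, typical of ArcFace-style models), and preprocessing are assumptions for illustration, not FaceTron's actual configuration or API.

```python
# Minimal sketch, assuming an ArcFace-style embedding model with a 1x3x112x112
# float input; the file name, input layout, and normalization are illustrative.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("face_embedding.onnx", providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def embed(face_crop: np.ndarray) -> np.ndarray:
    """face_crop: HxWx3 uint8 image, already aligned and cropped to the face."""
    x = face_crop.astype(np.float32) / 127.5 - 1.0   # normalize to [-1, 1]
    x = np.transpose(x, (2, 0, 1))[None, ...]        # HWC -> NCHW, add batch dim
    emb = session.run(None, {input_name: x})[0][0]
    return emb / np.linalg.norm(emb)                 # L2-normalize for cosine similarity

# Usage with a dummy 112x112 crop:
vec = embed(np.zeros((112, 112, 3), dtype=np.uint8))
print(vec.shape)
```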
