MNN: A blazing-fast, lightweight inference engine battle-tested by Alibaba, powering high-performance on-device LLMs and Edge AI.
Open3D: A Modern Library for 3D Data Processing
An open-source library for face detection in images; detection speed can reach 1000 FPS.
PaddlePaddle's high-performance deep learning inference engine for mobile and edge devices.
Tengine is a lightweight, high-performance, modular inference engine for embedded devices.
The Compute Library is a set of computer vision and machine learning functions optimised for both Arm CPUs and GPUs using SIMD technologies.
The official repository for the gem5 computer-system architecture simulator.
C++ image processing and machine learning library using SIMD: SSE, AVX, AVX-512, and AMX for x86/x64; NEON for ARM; HVX for Hexagon.
The OpenSource Disassembler
DOSBox Staging is a modern continuation of DOSBox with advanced features and current development practices.
A translator from Intel SSE intrinsics to Arm/AArch64 NEON implementations.
An HLE (high-level emulation) Nintendo 3DS emulator.
A simple hook tool to change the font of Win32 programs.
Genode OS Framework
Math library using HLSL syntax with multiplatform SIMD support
A deliberately vulnerable ARM/AArch64 application (a CTF-style exploitation tutorial covering 14 vulnerability techniques).
TinyChatEngine: On-Device LLM Inference Library