[ICLR 2019] ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware
Declarative way to run AI models in React Native on device, powered by ExecuTorch.
TinyChatEngine: On-Device LLM Inference Library
On-device Neural Engine
Simplified AI runtime integration for mobile app development
A curated collection of LLM-powered Flutter apps built using RAG, AI Agents, Multi-Agent Systems, MCP, and Voice Agents.
Free AI-powered malware scanner for Linux. Detects evasive threats without signatures. Scans offline, open-source CLI with optional cloud intelligence.
BLOCKSET: Efficient out-of-core tree ensemble inference
High-performance On-Device MoA (Mixture of Agents) Engine in C++. Optimized for CPU inference with RadixCache & PagedAttention. (Tiny-MoA Native)
DrishtiAI is a cutting-edge Android application that brings multimodal vision-language AI capabilities directly to your mobile device. Built with a privacy-first design, it processes text, images, and videos on device using advanced vision-language models (VLMs), with no internet connection required.