ncnn is a high-performance neural network inference computing framework designed specifically for mobile platforms. It brings artificial intelligence right to your fingertips: it has no third-party dependencies and runs faster than all other known open-source frameworks on mobile phone CPUs. ncnn lets developers easily deploy deep learning models to mobile platforms and build intelligent apps. It is cross-platform and supports most commonly used CNN networks, including Classical CNN (VGG, AlexNet, GoogleNet, Inception), Face Detection (MTCNN, RetinaFace), Segmentation (FCN, PSPNet, UNet, YOLACT), and more. ncnn is currently used in a number of Tencent applications, namely QQ, Qzone, WeChat, and Pitu.
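As a minimal sketch of what deployment looks like in C++, the snippet below loads a model already converted to ncnn's param/bin format and runs one forward pass. The file names (`squeezenet.param`, `squeezenet.bin`), input size, mean values, and blob names (`data`, `prob`) are placeholders that depend on the converted model, not fixed parts of the API.

```cpp
#include "net.h"   // ncnn

#include <vector>

// Run a single forward pass on raw RGB pixel data.
// File names and blob names below are model-dependent placeholders.
int classify(const unsigned char* rgb, int w, int h, std::vector<float>& scores)
{
    ncnn::Net net;
    net.opt.use_vulkan_compute = true;   // GPU acceleration, if ncnn was built with Vulkan
    net.opt.num_threads = 4;             // multi-core CPU acceleration

    if (net.load_param("squeezenet.param"))  // network structure
        return -1;
    if (net.load_model("squeezenet.bin"))    // network weights
        return -1;

    // resize the input pixels to the network input size and normalize
    ncnn::Mat in = ncnn::Mat::from_pixels_resize(rgb, ncnn::Mat::PIXEL_RGB, w, h, 227, 227);
    const float mean_vals[3] = {104.f, 117.f, 123.f};
    in.substract_mean_normalize(mean_vals, 0);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);    // input blob name from the .param file

    ncnn::Mat out;
    ex.extract("prob", out); // output blob name from the .param file

    scores.resize(out.w);
    for (int i = 0; i < out.w; i++)
        scores[i] = out[i];
    return 0;
}
```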
Features
- Supports most commonly used CNN networks
- Supports convolutional neural networks
- Supports multiple inputs and multi-branch structures
- Absolutely no third-party dependencies
- Cross-platform
- ARM NEON assembly
- Low memory footprint
- Supports multi-core parallel computing acceleration
- Supports GPU acceleration
- Small library size
- Extensible model design
- Supports loading network models by direct memory zero-copy reference
- Extensible via custom layer implementations registered at runtime (see the sketch after this list)
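As a sketch of the extension point mentioned in the last item above, a custom layer is implemented by deriving from `ncnn::Layer` and registering it before the param file is loaded. The layer name `MyReLU` and the param file name are hypothetical examples, not part of ncnn itself.

```cpp
#include "layer.h"
#include "net.h"

// Hypothetical custom layer: a plain ReLU, shown only to illustrate the extension API.
class MyReLU : public ncnn::Layer
{
public:
    virtual int forward(const ncnn::Mat& bottom_blob, ncnn::Mat& top_blob, const ncnn::Option& opt) const
    {
        top_blob = bottom_blob.clone(opt.blob_allocator);
        if (top_blob.empty())
            return -100;

        // clamp every element to be non-negative, channel by channel
        for (int q = 0; q < top_blob.c; q++)
        {
            float* ptr = top_blob.channel(q);
            for (int i = 0; i < top_blob.w * top_blob.h; i++)
            {
                if (ptr[i] < 0.f)
                    ptr[i] = 0.f;
            }
        }
        return 0;
    }
};

DEFINE_LAYER_CREATOR(MyReLU)

// Register the layer type before loading a param file that references "MyReLU".
int load_with_custom_layer(ncnn::Net& net)
{
    net.register_custom_layer("MyReLU", MyReLU_layer_creator);
    return net.load_param("model_with_custom_layer.param"); // hypothetical file name
}
```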
Categories
Artificial Intelligence, Mobile, Image Recognition, Neural Network Libraries, Deep Learning Frameworks, LLM Inference
License
BSD License