Local Dream

Android Stable Diffusion with Snapdragon NPU acceleration
Also supports CPU/GPU inference

App Demo

About this Repo

This project is now open source and completely free. Hope you enjoy it!

If you like it, please consider sponsoring this project.

Non-flagship chips are now experimentally supported. The app should work on NPUs that are Hexagon V68 or above.

Try our NPU Model Conversion Guide if you want to run your own models on the NPU.

You can join our Telegram group for discussion or to help with testing.

πŸš€ Quick Start

  1. Download: Get the APK from Releases or Google Play (NSFW filtered)
  2. Install: Install the APK on your Android device
  3. Select Models: Open the app and download the model(s) you want to use

✨ Features

  • 🎨 txt2img - Generate images from text descriptions
  • πŸ–ΌοΈ img2img - Transform existing images
  • 🎭 inpaint - Redraw selected areas of images
Feature Demo

πŸ”§ Build Instructions

Note: Building on Linux/WSL is recommended. Other platforms are not verified.

Prerequisites

The following tools are required for building:

  • Rust - Install rustup, then run:
    rustup default stable
    rustup target add aarch64-linux-android
  • Ninja - Build system
  • CMake - Build configuration

1. Clone Repository

git clone --recursive https://github.com/xororz/local-dream.git

2. Prepare SDKs

  1. Download QNN SDK: Get QNN_SDK_2.39 and extract it
  2. Download Android NDK: Get the Android NDK and extract it
  3. Configure paths:
    • Update QNN_SDK_ROOT in app/src/main/cpp/CMakeLists.txt
    • Update ANDROID_NDK_ROOT in app/src/main/cpp/CMakePresets.json

3. Build Libraries

🐧 Linux
cd app/src/main/cpp/
bash ./build.sh
πŸͺŸ Windows
# Install dependencies if needed:
# winget install Kitware.CMake
# winget install Ninja-build.Ninja
# winget install Rustlang.Rustup

cd app\src\main\cpp\

# Convert patch file (install dos2unix if needed: winget install -e --id waterlan.dos2unix)
dos2unix SampleApp.patch
.\build.bat
🍎 macOS
# Install dependencies with Homebrew:
# brew install cmake rust ninja

# Fix CMake version compatibility
sed -i '' '2s/$/ -DCMAKE_POLICY_VERSION_MINIMUM=3.5/' build.sh
bash ./build.sh

4. Build APK

Open this project in Android Studio and navigate to: Build β†’ Generate App Bundles or APKs β†’ Generate APKs

Technical Implementation

NPU Acceleration

  • SDK: Qualcomm QNN SDK leveraging Hexagon NPU
  • Quantization: W8A16 static quantization for optimal performance
  • Resolution: Fixed 512Γ—512 model shape
  • Performance: Extremely fast inference speed
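W8A16 here means weights are stored as 8-bit integers and activations as 16-bit integers, with all scales fixed ahead of time (that is what "static" means), so the NPU can run pure integer matmuls. A minimal numpy sketch of the idea — not the QNN implementation, and the calibration range below is an assumed value:

```python
import numpy as np

np.random.seed(0)

def quantize_static(x, scale, bits):
    # Map float values to signed integers using a precomputed scale.
    qmin, qmax = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    return np.clip(np.round(x / scale), qmin, qmax).astype(np.int32)

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Weights: 8-bit, scale chosen offline from the weight tensor itself.
w = np.random.randn(4, 4).astype(np.float32)
w_scale = np.abs(w).max() / 127.0
w_q = quantize_static(w, w_scale, bits=8)

# Activations: 16-bit, scale fixed in advance from calibration runs.
a = np.random.randn(4).astype(np.float32)
a_scale = 3.0 / 32767.0  # assumes calibration saw |a| <= 3.0
a_q = quantize_static(a, a_scale, bits=16)

# Integer matmul, one float rescale at the output.
y = dequantize(w_q @ a_q, w_scale * a_scale)
err = np.abs(y - w @ a).max()  # small quantization noise
```

Because no scales are computed at run time, the whole graph can be compiled to fixed-point Hexagon code, which is where the speed comes from.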

CPU/GPU Inference

  • Framework: Powered by MNN framework
  • Quantization: W8 dynamic quantization
  • Resolution: Flexible sizes (128Γ—128, 256Γ—256, 384Γ—384, 512Γ—512)
  • Performance: Moderate speed with high compatibility
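With W8 dynamic quantization, by contrast, the 8-bit scales are computed from each tensor at inference time, so no calibration pass is needed and arbitrary input sizes work. A per-tensor numpy sketch of the idea (MNN's real scheme is more refined, e.g. per-channel weight scales):

```python
import numpy as np

np.random.seed(0)

def quantize_dynamic(x):
    # Scale comes from the tensor itself at run time - no calibration.
    scale = max(np.abs(x).max() / 127.0, 1e-12)
    q = np.clip(np.round(x / scale), -128, 127).astype(np.int8)
    return q, scale

w = np.random.randn(8, 8).astype(np.float32)
a = np.random.randn(8).astype(np.float32)
w_q, w_s = quantize_dynamic(w)
a_q, a_s = quantize_dynamic(a)

# Integer matmul, then one float rescale at the output.
y = (w_q.astype(np.int32) @ a_q.astype(np.int32)) * (w_s * a_s)
```

The run-time scale computation is part of why this path is slower than the NPU path but compatible with flexible resolutions.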

NPU High Resolution Support

After downloading a 512 resolution model, you can download patches to enable 768Γ—768 and 1024Γ—1024 image generation. Please note that quantized high-resolution models may produce images with poor layout. We recommend first generating at 512 resolution, then using the high-resolution model for img2img (which is essentially Highres.fix). The suggested img2img denoise_strength is around 0.8.
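The recipe above is essentially Highres.fix: with denoise_strength around 0.8, the upscaled 512 image is noised and then denoised for only the last 80% of the schedule, so the composition from the 512 pass survives while detail is regenerated at high resolution. A toy numpy illustration of the bookkeeping (nearest-neighbour upscale and step counting only — not the app's actual sampler):

```python
import numpy as np

np.random.seed(0)
steps = 20
denoise_strength = 0.8

base = np.random.rand(512, 512, 3)         # stand-in for the 512x512 result
hires = np.kron(base, np.ones((2, 2, 1)))  # naive 2x nearest-neighbour upscale

# img2img skips the earliest, most composition-destroying steps:
run_steps = int(steps * denoise_strength)      # 16 of 20 steps are denoised
skipped = steps - run_steps                    # the first 4 are skipped
```

Lower strength keeps more of the 512 layout; higher strength behaves more like a fresh txt2img at the large size, which is where the poor-layout problem comes back.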

Device Compatibility

NPU Acceleration Support

Compatible with devices featuring:

  • Snapdragon 8 Gen 1
  • Snapdragon 8+ Gen 1
  • Snapdragon 8 Gen 2
  • Snapdragon 8 Gen 3
  • Snapdragon 8 Elite
  • Snapdragon 8 Elite Gen 5
  • Non-flagship chips with Hexagon V68 or above (e.g., Snapdragon 7 Gen 1, 8s Gen 3)

Note: Other devices cannot download NPU models.

CPU/GPU Support

  • RAM Requirement: ~2GB available memory
  • Compatibility: Most Android devices from recent years

Available Models

Importing local SD1.5-based safetensors models is now supported for CPU/GPU inference.

You can now import your own NPU models converted with our easy-to-follow NPU Model Conversion Guide, or download pre-converted models from xororz/sd-qnn or Mr-J-369. Download the _min variants if you are using a non-flagship chip, _8gen1 if you are using 8 Gen 1, and _8gen2 if you are using 8 Gen 2/3/4/5. We recommend checking the instructions on the original model page to set up prompts and parameters.

| Model | Type | CPU/GPU | NPU | Clip Skip | Source |
| --- | --- | --- | --- | --- | --- |
| Anything V5.0 | SD1.5 | βœ… | βœ… | 2 | CivitAI |
| ChilloutMix | SD1.5 | βœ… | βœ… | 1 | CivitAI |
| Absolute Reality | SD1.5 | βœ… | βœ… | 2 | CivitAI |
| QteaMix | SD1.5 | βœ… | βœ… | 2 | CivitAI |
| CuteYukiMix | SD1.5 | βœ… | βœ… | 2 | CivitAI |
| Stable Diffusion 2.1 | SD2.1 | ❌ | βœ… | 1 | HuggingFace |
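The Clip Skip column says which hidden layer of the CLIP text encoder feeds the diffusion model: 1 uses the final layer's output, 2 the second-to-last, which many anime-style SD1.5 models were trained with. A toy sketch with random linear "layers" standing in for a real CLIP encoder:

```python
import numpy as np

np.random.seed(0)

def encode(tokens, layers, clip_skip=1):
    # Toy text encoder: run all layers, keep every hidden state, then
    # return the one `clip_skip` layers from the end (1 = last layer).
    h = tokens
    hidden_states = []
    for w in layers:
        h = np.tanh(h @ w)
        hidden_states.append(h)
    return hidden_states[-clip_skip]

layers = [np.random.randn(8, 8) * 0.1 for _ in range(12)]
tokens = np.random.randn(4, 8)
cond = encode(tokens, layers, clip_skip=2)  # e.g. what Anything V5.0 expects
```

Using the wrong Clip Skip does not crash anything; it just conditions the model on embeddings it was not trained for, which degrades prompt adherence.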

🎲 Seed Settings

Custom seed support for reproducible image generation:

  • CPU Mode: Seeds guarantee identical results across different devices with same parameters
  • GPU Mode: Results may differ from CPU mode and can vary between different devices
  • NPU Mode: Seeds ensure consistent results only on devices with identical chipsets
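The reason for the three behaviours above: the seed only fixes the starting noise, and it is the backend that determines whether the rest of the computation is bit-identical. CPU math reproduces everywhere; GPU and NPU kernels vary between drivers and chipsets. A toy numpy version of the deterministic part (the app's actual RNG and latent shape are assumptions here):

```python
import numpy as np

def initial_latents(seed, shape=(1, 4, 64, 64)):
    # Same seed -> bit-identical starting noise. On the CPU backend the
    # rest of the pipeline is deterministic too, so the final image is
    # identical across devices; GPU/NPU kernels break that guarantee.
    return np.random.default_rng(seed).standard_normal(shape, dtype=np.float32)

a = initial_latents(42)
b = initial_latents(42)
assert (a == b).all()  # reproducible for the same seed
```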

Credits & Acknowledgments

C++ Libraries

Android Libraries

NSFW Detection Model


πŸ’– Support This Project

If you find Local Dream useful, please consider supporting its development:

What Your Support Helps With:

  • Additional Models - More AI model integrations
  • New Features - Enhanced functionality and capabilities
  • Bug Fixes - Continuous improvement and maintenance
Buy Me a Coffee at ko-fi.com · Support me on Afdian

Your sponsorship helps maintain and improve Local Dream for everyone!
