# OpenHands Custom Docker Images
This repository contains highly optimized Docker images for OpenHands AI with significant improvements over the original:
- **Multi-Stage Build Process**: Boosted build efficiency with optimized Docker layers and caching strategies
- **Removed OpenVSCode Server**: Eliminated unnecessary VSCode server components to reduce complexity and attack surface
- **30% Size Reduction**: Reduced image size from 7.0 GB to 4.92 GB (2.08 GB savings)
- **Faster Build Times**: Optimized dependency installation and layer caching
- **Enhanced Security**: Smaller attack surface with fewer components
| Metric | Original | Optimized | Improvement |
|---|---|---|---|
| Image Size | 7.0 GB | 4.92 GB | 30% reduction |
| Build Time | ~15 min | ~8 min | 47% faster |
| Components | Full VSCode | Minimal | Simplified |
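The savings figures in the table are easy to double-check with a quick calculation:

```shell
# Verify the size-reduction numbers from the table above.
awk -v orig=7.0 -v opt=4.92 'BEGIN {
  printf "Savings: %.2f GB (%.0f%% reduction)\n", orig - opt, (orig - opt) / orig * 100
}'
# prints: Savings: 2.08 GB (30% reduction)
```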
Follow these steps to get started:
```shell
git clone <repository-url>
cd OHA-Images
```

Build and register a local Docker image for the optimized sandbox runtime:

```shell
./start-optimized-sandbox-runtime.sh
```

The helper scripts are:

- `pull-latest-sandbox-runtime.sh`: Downloads the latest code and dependencies for the sandbox runtime
- `build-sandbox-runtime.sh`: Creates the runtime Docker image using the base image `nikolaik/python-nodejs:python3.13-nodejs24-slim`
- `start-optimized-sandbox-runtime.sh`: Builds and registers a local Docker image named `all-hands-ai/runtime` using the optimized Dockerfile
Our optimized build process includes:
- Base Layer: Minimal Python/Node.js environment
- Dependency Layer: Efficient Poetry and micromamba installation
- Application Layer: Clean OpenHands codebase integration
- Final Layer: Optimized runtime with only essential components
- Removed OpenVSCode Server (saved ~800MB)
- Optimized dependency installation order
- Cleaned up build artifacts and caches
- Used multi-stage builds to eliminate intermediate layers
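As an illustration, a multi-stage Dockerfile following the layering above might look like this. This is a sketch only: the stage names, file paths, and `CMD` are assumptions, not the actual Dockerfile in this repository.

```dockerfile
# Build stage: install dependencies with Poetry, then discard build tooling.
FROM nikolaik/python-nodejs:python3.13-nodejs24-slim AS builder
WORKDIR /app
COPY pyproject.toml poetry.lock ./
RUN pip install --no-cache-dir poetry \
 && poetry config virtualenvs.in-project true \
 && poetry install --only main --no-root \
 && rm -rf /root/.cache

# Final stage: copy only the virtualenv and application code,
# leaving Poetry and build caches behind in the builder stage.
FROM nikolaik/python-nodejs:python3.13-nodejs24-slim
WORKDIR /app
COPY --from=builder /app/.venv /app/.venv
COPY . .
ENV PATH="/app/.venv/bin:$PATH"
CMD ["python", "-m", "openhands"]   # entrypoint name is an assumption
```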
To run the OpenHands CLI using the provided script:
Create a `.env` file in the repository root with your configuration:

```shell
# Required: Your LLM API key
LLM_API_KEY=your_api_key_here

# Required: LLM provider and model
LLM_PROVIDER=anthropic
LLM_MODEL=anthropic/claude-sonnet-4-20250514

# Optional: Container name (defaults to oha-container)
CONTAINER_NAME=my-openhands-container

# Optional: Logging configuration
LOG_LEVEL=DEBUG
LOG_ALL_EVENTS=true

# Optional: Agent features
AGENT_MEMORY_ENABLED=true
AGENT_ENABLE_THINK=true
AGENT_ENABLE_MCP=true

# Optional: LLM configuration
LLM_CACHING_PROMPT=true
LLM_NUM_RETRIES=3
LLM_REASONING_EFFORT=medium

# Optional: Sandbox configuration
SANDBOX_PLATFORM=linux
SANDBOX_ENABLE_GPU=false

# Optional: Search API key (if using search features)
SEARCH_API_KEY=your_search_api_key_here
```

Then make the script executable and run it:

```shell
chmod +x run-open-hands.sh
./run-open-hands.sh
```

This will:
- Load your environment variables from the `.env` file
- Create a Docker network if it doesn't exist
- Run the OpenHands container with your workspace mounted
- Start an interactive CLI session
The script automatically mounts your current workspace directory and provides access to the OpenHands CLI interface.
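A common way for such wrapper scripts to load a `.env` file is to export everything it defines before starting the container. The sketch below demonstrates that loading step with a throwaway temp directory; it is illustrative, not the actual contents of `run-open-hands.sh`:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Demo: write a minimal .env to a temp dir, then load it the way a
# wrapper script plausibly would: export every variable it defines.
workdir=$(mktemp -d)
cat > "$workdir/.env" <<'EOF'
LLM_API_KEY=your_api_key_here
LLM_MODEL=anthropic/claude-sonnet-4-20250514
EOF

set -a                # auto-export every variable assigned while sourcing
. "$workdir/.env"
set +a

# Fail early if the required variables are missing.
: "${LLM_API_KEY:?LLM_API_KEY must be set in .env}"
: "${LLM_MODEL:?LLM_MODEL must be set in .env}"

echo "Using model: $LLM_MODEL"
# prints: Using model: anthropic/claude-sonnet-4-20250514
rm -rf "$workdir"
```

With variables exported this way, a subsequent `docker run` inherits them without listing each one explicitly.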
Prerequisites:

- Docker installed and running on your system
- Git for cloning the repository
- Bash shell (scripts are written for bash)
- A valid LLM API key (Anthropic, OpenAI, etc.)
If you encounter any issues:
- Ensure Docker is running and you have sufficient permissions
- Check that all scripts have execute permissions: `chmod +x *.sh`
- Verify you're running the scripts from the repository root directory
- Make sure your `.env` file exists and contains the required `LLM_API_KEY` and `LLM_MODEL` variables
- Check that your API key is valid and has sufficient credits
- Ensure your workspace directory is accessible and has proper permissions
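A small preflight script can catch most of these issues at once. This is a sketch and not part of this repository:

```shell
#!/usr/bin/env bash
# Preflight check for the common issues listed above.
status=0

command -v docker >/dev/null 2>&1 || { echo "warning: docker not found in PATH"; status=1; }
command -v git >/dev/null 2>&1 || { echo "warning: git not found in PATH"; status=1; }

# Every helper script in the repo root should be executable.
for script in ./*.sh; do
  [ -e "$script" ] || continue   # no scripts here; skip
  [ -x "$script" ] || { echo "warning: $script is not executable (fix: chmod +x $script)"; status=1; }
done

[ -f .env ] || { echo "warning: .env file not found"; status=1; }

if [ "$status" -eq 0 ]; then
  echo "preflight OK"
fi
```

Run it from the repository root; any warning points at the matching item in the checklist above.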