Deep zoom viewer for gigantic NASA images with AI-powered search and collaborative annotations.
Built for a 36-hour hackathon, Astro-Zoom lets you explore massive astronomical images with smooth deep-zoom navigation powered by OpenSeadragon, find interesting features using CLIP-based semantic search, and annotate regions collaboratively.
- 📤 Image Upload — Upload your own images via web UI with automatic tile processing
- 🔍 Deep Zoom Navigation — Explore gigapixel images with smooth pan/zoom using OpenSeadragon
- 🤖 AI Search — Find features with natural language queries powered by CLIP embeddings
- ✏️ Annotations — Create points, rectangles, and polygons with labels
- ⚖️ Compare Mode — Side-by-side view with synchronized zoom/pan
- ⏱️ Timeline — View temporal changes in image datasets
- 🎨 Modern UI — Dark theme, responsive design, keyboard shortcuts
- 🚀 Production Ready — Docker, rate limiting, authentication, CI/CD
```
astro-zoom/
├── apps/
│   ├── web/        Next.js 14 + OpenSeadragon viewer
│   ├── api/        FastAPI backend with SQLite
│   └── ai/         CLIP + FAISS semantic search
├── packages/
│   ├── proto/      Shared TypeScript/Python schemas
│   └── ui/         React component library
├── infra/
│   ├── docker-compose.yml
│   ├── tiles/      Sample DZI tile pyramids
│   └── Dockerfile.*
└── .github/workflows/   CI/CD pipelines
```
**Frontend**
- Next.js 14 (App Router), TypeScript
- OpenSeadragon (deep zoom)
- Zustand (state), TanStack Query (data fetching)
- Tailwind CSS

**Backend**
- FastAPI, Uvicorn, SQLModel (SQLite)
- JWT authentication, rate limiting
- DZI tile serving

**AI**
- CLIP (OpenAI ViT-B/32) or stub fallback
- FAISS (CPU) vector search
- NumPy, Pillow

**DevOps**
- Monorepo: pnpm workspaces + Turborepo
- Docker Compose
- GitHub Actions CI
- Node.js 20+
- pnpm 8+ (installed automatically if missing)
- Python 3.11+
- Docker (optional, for containerized setup)
```bash
# Clone the repository
git clone <your-repo-url>
cd astro-zoom

# Run setup script (installs deps, generates tiles)
chmod +x infra/setup.sh
./infra/setup.sh

# Start all services
pnpm dev
```

Services will be available at:
- 🌐 Web: http://localhost:3000
- 🔌 API: http://localhost:8000 (docs at /docs)
- 🤖 AI: http://localhost:8001
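As a quick sanity check, a short script can probe the three URLs above and report which services respond. This is just a sketch: it issues plain GET requests (the API's `/docs` page for port 8000) and assumes no specific health endpoint exists.

```python
# Readiness check for the three Astro-Zoom services listed above.
# Any response below HTTP 500 counts as "up"; connection errors count as down.
import requests

SERVICES = {
    "web": "http://localhost:3000",
    "api": "http://localhost:8000/docs",
    "ai": "http://localhost:8001",
}

def check_services(timeout: float = 2.0) -> dict[str, bool]:
    """Return {service name: reachable?} for each known service."""
    up = {}
    for name, url in SERVICES.items():
        try:
            r = requests.get(url, timeout=timeout)
            up[name] = r.status_code < 500
        except requests.RequestException:
            up[name] = False
    return up

if __name__ == "__main__":
    for name, ok in check_services().items():
        print(f"{name}: {'up' if ok else 'DOWN'}")
```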
⚠️ Troubleshooting: If you see "Failed to fetch datasets" error, the API server may not be running. See START_SERVICES.md for help.
```bash
# Generate sample tiles first
python3 infra/generate_sample_tiles.py

# Start all services
cd infra
docker compose up --build
```

Wait ~30s for services to start, then visit http://localhost:3000.
The easiest way to add your own high-resolution images is through the web interface:
- Start the services: `pnpm dev`
- Navigate to http://localhost:3000
- Click "Upload Image" button
- Select or drag-and-drop your image (JPG, PNG, TIFF up to 500MB)
- Enter name and description
- Wait for processing (10-30 minutes depending on size)
- View your dataset!
The system will automatically generate optimized DZI tiles with progress tracking.
Supported formats: JPG, PNG, TIFF • Max size: 500MB • Min dimensions: 256×256px
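The same flow can be scripted against the REST API (the `curl` examples appear further down in this README). Below is a minimal sketch of an upload-and-wait client; the endpoints are the documented ones, but the response fields (`id` on upload, a `status` field that ends at `complete` or `failed`) are assumptions about the payload shape.

```python
# Sketch: upload an image via the API and poll until tiles are ready.
# Response field names ("id", "status") are assumed, not confirmed.
import time
import requests

API = "http://localhost:8000"

def upload(path: str, name: str, description: str = "") -> str:
    """POST the file to /uploads/upload and return the new dataset id."""
    with open(path, "rb") as f:
        r = requests.post(f"{API}/uploads/upload",
                          files={"file": f},
                          data={"name": name, "description": description})
    r.raise_for_status()
    return r.json()["id"]          # assumed response field

def wait_for_tiles(dataset_id: str, poll_secs: int = 30) -> None:
    """Poll /uploads/status until processing finishes (can take 10-30 min)."""
    while True:
        r = requests.get(f"{API}/uploads/status/{dataset_id}")
        r.raise_for_status()
        status = r.json().get("status")   # assumed response field
        if status == "complete":
            return
        if status == "failed":
            raise RuntimeError("tile generation failed")
        time.sleep(poll_secs)
```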
Each dataset card on the homepage includes a delete button (trash icon):
- Click the delete button on any dataset
- Confirm the deletion
- The system will robustly remove all associated files (tiles, uploads, database entries)
- Deletion works even if the dataset is corrupted or partially missing
Storage Optimization: The system automatically cleans up temporary files:
- Original upload files are deleted after successful tile generation
- Temp directories are cleaned automatically
- Only the optimized tiles and database entries are kept
- This saves 40-60% disk space compared to keeping original uploads
See STORAGE_LOCATIONS.md for detailed storage information.
The project includes mock sample tiles by default. To manually process the real 209MB NASA Andromeda image:

```bash
# Install tiling dependencies
pip install -r infra/requirements_tiling.txt

# Process the real image (takes 10-30 minutes)
python infra/process_real_image.py
```

This downloads the actual NASA Hubble Andromeda mosaic (42208×9870 pixels) and generates an optimized tile pyramid. See infra/TILE_GENERATION.md for details.
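To get a feel for the pyramid size, the DZI level arithmetic is easy to compute by hand: each level halves the previous one, down to a single pixel. The sketch below uses the Andromeda dimensions quoted above and the common 254px tile size (the `vips dzsave` default); the real script may use different settings.

```python
# Sketch: DZI pyramid geometry for a 42208 x 9870 image.
import math

def dzi_levels(width: int, height: int):
    """Yield (level, level_width, level_height) from level 0 (1px) up."""
    max_level = math.ceil(math.log2(max(width, height)))
    for level in range(max_level + 1):
        scale = 2 ** (max_level - level)
        yield level, math.ceil(width / scale), math.ceil(height / scale)

def tiles_at(level_w: int, level_h: int, tile_size: int = 254) -> int:
    """Number of tiles needed to cover one pyramid level."""
    return math.ceil(level_w / tile_size) * math.ceil(level_h / tile_size)

levels = list(dzi_levels(42208, 9870))
for level, w, h in levels:
    print(f"level {level:2d}: {w:6d} x {h:5d} -> {tiles_at(w, h)} tiles")
```

The full-resolution level alone needs a few thousand tiles, which is why generation takes tens of minutes.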
- Visit http://localhost:3000
- Click on "Andromeda Galaxy (Sample)" or "Andromeda Galaxy (NASA Hubble 2025)"
- Use mouse to pan/zoom, or:
- Scroll to zoom
- Drag to pan
- F key to fit image
- G key to toggle grid
- 1 — Explore mode
- 2 — Compare mode (side-by-side)
- 3 — Annotate mode
- F — Fit to viewport
- G — Toggle grid overlay
- Switch to Annotate mode (press `3` or click the toolbar)
- Choose annotation type: Point or Rectangle
- Click on the image:
- Point: Single click
- Rectangle: Click start, then click end
- Annotations save automatically and persist on refresh
- Open the Search Box (left sidebar)
- Enter a natural language query, e.g.:
- "bright star cluster"
- "spiral arm structure"
- "dark dust lane"
- Click results to fly to matching regions
Note: Search uses stub embeddings by default. For real AI search, install `open-clip-torch` and `torch` in `apps/ai`.
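Conceptually, the stub mode works like this: each query and each indexed region gets a deterministic pseudo-embedding, and search is a nearest-neighbor lookup over those vectors. The sketch below is illustrative only — hash-seeded random vectors stand in for CLIP, and brute-force cosine similarity stands in for FAISS; none of these names are the actual service API.

```python
# Sketch of the stub search flow: deterministic fake embeddings plus
# brute-force cosine search (placeholders for CLIP and FAISS).
import hashlib
import numpy as np

DIM = 512  # CLIP ViT-B/32 embedding size

def stub_embed(text: str) -> np.ndarray:
    """Deterministic pseudo-embedding: the same text always maps to the same unit vector."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(DIM).astype("float32")
    return v / np.linalg.norm(v)

def search(query: str, region_vecs: np.ndarray, k: int = 3) -> list:
    """Return indices of the k regions most similar to the query."""
    scores = region_vecs @ stub_embed(query)   # cosine sim (unit vectors)
    return list(np.argsort(scores)[::-1][:k])

# "Index" a handful of fake image regions the same way
regions = np.stack([stub_embed(f"region-{i}") for i in range(10)])
print(search("bright star cluster", regions))
```

With real CLIP installed, only the embedding step changes; the index/query structure is the same.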
```
apps/web/
├── src/
│   ├── app/          Next.js App Router pages
│   ├── components/   React components
│   │   ├── DeepZoomViewer.tsx
│   │   ├── CompareSwipe.tsx
│   │   ├── Annotator.tsx
│   │   └── ...
│   ├── lib/          API client
│   └── store/        Zustand stores

apps/api/
├── app/
│   ├── main.py       FastAPI app
│   ├── models.py     SQLModel database models
│   ├── routers/      API endpoints
│   │   ├── datasets.py
│   │   ├── annotations.py
│   │   ├── search.py
│   │   └── tiles.py
│   └── seed.py       Database seeding

apps/ai/
├── app/
│   ├── main.py       AI service
│   ├── clip_stub.py  Fallback implementation
│   └── indexer.py    FAISS indexing
```
**Web (Next.js)**
```bash
cd apps/web
pnpm dev
```

**API (FastAPI)**
```bash
cd apps/api
make dev
# or: uvicorn app.main:app --reload --port 8000
```

**AI (Python)**
```bash
cd apps/ai
make dev
# or: uvicorn app.main:app --reload --port 8001
```

```bash
# Build all packages
pnpm build

# Run production web server
cd apps/web
pnpm build
pnpm start

# Run production API/AI (use gunicorn or similar)
uvicorn app.main:app --host 0.0.0.0 --port 8000 --workers 4
```

```bash
# Lint all code
pnpm lint

# Typecheck TypeScript
pnpm typecheck

# Python linting
cd apps/api
ruff check app/

# Run tests (if implemented)
pnpm test
```

The easiest way! Upload through the web interface at http://localhost:3000:
- Click "Upload Image" button
- Select your image (JPG, PNG, or TIFF up to 500MB)
- Enter dataset name and description
- The system automatically:
  - Validates the image
  - Generates multi-resolution DZI tiles
  - Creates a database entry
  - Indexes for search

API Upload:

```bash
curl -X POST http://localhost:8000/uploads/upload \
  -F "file=@your-image.jpg" \
  -F "name=My Dataset" \
  -F "description=Optional description"

# Check processing status
curl http://localhost:8000/uploads/status/{dataset-id}
```

For advanced use cases or external tile generation:
1. Create DZI Tiles

Use vips, OpenSlide, or Python:

```bash
# With vips
vips dzsave your_image.tif infra/tiles/my-dataset

# Or use the provided script
python infra/process_real_image.py  # Edit paths in script
```

2. Register Dataset

Add to apps/api/app/seed.py:

```python
dataset = Dataset(
    id="my-dataset",
    name="My Amazing Dataset",
    description="High-resolution image of...",
    tile_type="dzi",
    tile_url="/tiles/my-dataset",
    levels=json.dumps([0, 1, 2, 3, 4]),
    pixel_size=json.dumps([16384, 16384]),
)
session.add(dataset)
```

3. Build Search Index (optional)

```bash
cd apps/ai
python build_index.py my-dataset
```

Demo credentials (for annotation writes):
- Username: `editor`
- Password: `demo123`

Login at /auth/login or via API:

```bash
curl -X POST http://localhost:8000/auth/login \
  -H "Content-Type: application/json" \
  -d '{"username":"editor","password":"demo123"}'
```

Use the returned JWT token in an `Authorization: Bearer <token>` header.
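The same login-then-write flow in Python might look like the sketch below. The endpoints and demo credentials come from this README; the response field `access_token` and the annotation body fields are assumptions about the payload shapes.

```python
# Sketch: log in with the demo credentials, then create an annotation
# with the returned JWT. Payload field names are illustrative.
import requests

API = "http://localhost:8000"

def login(username: str = "editor", password: str = "demo123") -> dict:
    """POST /auth/login and return an Authorization header dict."""
    r = requests.post(f"{API}/auth/login",
                      json={"username": username, "password": password})
    r.raise_for_status()
    token = r.json()["access_token"]     # assumed response field
    return {"Authorization": f"Bearer {token}"}

def create_point(dataset_id: str, x: float, y: float, label: str, headers: dict):
    """POST /annotations with an assumed point-annotation body."""
    r = requests.post(f"{API}/annotations", headers=headers,
                      json={"datasetId": dataset_id, "type": "point",
                            "geometry": {"x": x, "y": y}, "label": label})
    r.raise_for_status()
    return r.json()
```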
```bash
cd infra
docker compose build
```

```bash
docker compose logs -f web
docker compose logs -f api
docker compose logs -f ai
```

```bash
docker compose down -v
docker compose up --build
```

FastAPI automatically generates OpenAPI docs:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
**Datasets**
- `GET /datasets` — List all datasets
- `GET /datasets/{id}` — Get dataset details

**Annotations**
- `GET /annotations?datasetId=X` — List annotations
- `POST /annotations` — Create annotation
- `PUT /annotations/{id}` — Update annotation
- `DELETE /annotations/{id}` — Delete annotation

**Search**
- `GET /search?q=crater&datasetId=X` — AI semantic search

**Tiles**
- `GET /tiles/{dataset}/info.dzi` — DZI descriptor
- `GET /tiles/{dataset}/{level}/{col}_{row}.jpg` — Tile image

**Uploads**
- `POST /uploads/upload` — Upload and process image
- `GET /uploads/status/{id}` — Check processing status
- `DELETE /uploads/{id}` — Delete dataset and tiles
This is a hackathon project, but contributions are welcome!
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing`)
- Commit your changes (`git commit -am 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing`)
- Open a Pull Request
MIT License - see LICENSE file for details.
- OpenSeadragon — Incredible deep zoom library
- OpenAI CLIP — Semantic image search
- NASA/ESA — Inspiring imagery
- FastAPI — Modern Python web framework
- Next.js — React framework
```bash
# Find and kill process using port 3000
lsof -ti:3000 | xargs kill -9

# Or use different ports
PORT=3001 pnpm dev
```

```bash
# Remove SQLite lock
rm data/astro.db-shm data/astro.db-wal
```

Make sure NEXT_PUBLIC_API_URL matches your API URL:

```bash
# .env
NEXT_PUBLIC_API_URL=http://localhost:8000
```

Check that tiles exist:

```bash
ls -la infra/tiles/andromeda/
python3 infra/generate_sample_tiles.py
```

The stub implementation generates random results. For real search:

```bash
cd apps/ai
pip install open-clip-torch torch
python build_index.py andromeda
```

Built for a 36-hour hackathon by your team name here.
Happy exploring! 🌌✨