Event-driven system with two independent, containerized services:
- Producer: Generates random-number events and publishes them to Kafka
- Consumer: Subscribes to events and persists them to an NDJSON file with deduplication
- Event-driven architecture using Apache Kafka
- Independently deployable services across different environments
- Contract-first development with JSON Schema validation
- Structured logging with correlation IDs
- Configurable via environment variables
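For orientation, a minimal version of the producer loop might look like the sketch below. The `kafka-python` client, the `random-numbers` topic name, and the event field names are illustrative assumptions, not the project's actual implementation.

```python
# Hypothetical producer sketch: client library, topic name, and event fields
# are assumptions for illustration only.
import json
import os
import random
import time
import uuid
from datetime import datetime, timezone

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers=os.getenv("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092"),
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

while True:
    event = {
        "event_id": str(uuid.uuid4()),        # used downstream for deduplication
        "correlation_id": str(uuid.uuid4()),  # propagated through structured logs
        "value": random.randint(0, 100),      # the random-number payload
        "produced_at": datetime.now(timezone.utc).isoformat(),
    }
    producer.send("random-numbers", value=event)  # hypothetical topic name
    producer.flush()
    time.sleep(float(os.getenv("GEN_INTERVAL_SECONDS", "1")))
```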
See the quickstart guide for detailed setup instructions.
- Docker
- Python 3.11+ (for local development)
- Start Kafka:

  ```bash
  docker run -d --rm --name kafka \
    -p 9092:9092 \
    apache/kafka:latest
  ```

- Install dependencies:

  ```bash
  pip install -e ".[dev]"
  ```

- Run both services with docker-compose:
  ```bash
  # Start all services (Kafka + Producer + Consumer)
  docker-compose -f docker-compose.integration.yml up

  # Or run in background
  docker-compose -f docker-compose.integration.yml up -d

  # Stop services
  docker-compose -f docker-compose.integration.yml down
  ```

Alternatively, run services individually:
```bash
make run-producer   # In one terminal
make run-consumer   # In another terminal
```

Run the test suites with pytest:

```bash
# Unit tests
pytest tests/unit/

# Contract tests
pytest tests/contract/

# Integration tests
pytest tests/integration/
```
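A contract test in `tests/contract/` might validate events against the shared JSON Schema roughly as sketched below; the schema fields and the inline schema definition are assumptions for illustration, not the project's actual contract.

```python
# Hypothetical contract test: the event fields and schema are illustrative
# assumptions; the real contract lives in the project's schema files.
import jsonschema

EVENT_SCHEMA = {
    "type": "object",
    "required": ["event_id", "value", "produced_at"],
    "properties": {
        "event_id": {"type": "string"},
        "value": {"type": "integer"},
        "produced_at": {"type": "string", "format": "date-time"},
    },
}


def test_event_matches_contract():
    event = {
        "event_id": "abc-123",
        "value": 42,
        "produced_at": "2024-01-01T00:00:00Z",
    }
    jsonschema.validate(instance=event, schema=EVENT_SCHEMA)  # raises on violation
```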
Both services are configured via environment variables.

Producer:
- `KAFKA_BOOTSTRAP_SERVERS` (default: `localhost:9092`)
- `GEN_INTERVAL_SECONDS` (default: `1`)

Consumer:
- `KAFKA_BOOTSTRAP_SERVERS` (default: `localhost:9092`)
- `OUTPUT_PATH` (default: `./data/output.jsonl`)
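For orientation, the consumer behaviour described above could be sketched roughly as follows; the `kafka-python` client, the topic name, and the dedup-by-`event_id` strategy are assumptions made for illustration.

```python
# Hypothetical consumer sketch: client library, topic name, and field names
# are assumptions. Configuration comes from the environment variables above.
import json
import os
from pathlib import Path

from kafka import KafkaConsumer  # pip install kafka-python

bootstrap = os.getenv("KAFKA_BOOTSTRAP_SERVERS", "localhost:9092")
output_path = Path(os.getenv("OUTPUT_PATH", "./data/output.jsonl"))
output_path.parent.mkdir(parents=True, exist_ok=True)

consumer = KafkaConsumer(
    "random-numbers",                      # hypothetical topic name
    bootstrap_servers=bootstrap,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

seen_ids: set[str] = set()                 # in-memory deduplication by event_id
with output_path.open("a", encoding="utf-8") as out:
    for message in consumer:
        event = message.value
        if event["event_id"] in seen_ids:
            continue                         # drop duplicates
        seen_ids.add(event["event_id"])
        out.write(json.dumps(event) + "\n")  # append one NDJSON line per event
        out.flush()
```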
Built following the Events Architecture Constitution:
- Event-driven communication via Kafka
- Containerized services with Docker
- Contract-driven interface definitions
- Independent cross-environment deployment
- Structured observability
See the implementation plan for technical details.