After installing the requirements, install the additional test dependencies:
```bash
pip install pytest numpy Cython pytest-docker psycopg2
```

Each time you run the integration tests, a process spins up Kafka, Zookeeper, MongoDB and PostgreSQL for your use. After the tests run, the infrastructure is shut down.
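Under the hood this relies on pytest-docker fixtures. Below is a minimal sketch of a `conftest.py` that brings a compose file up and waits for the PostgreSQL service; the compose file path, service name and credentials are assumptions for illustration, not the repository's actual values.

```python
# conftest.py (illustrative sketch, not the repository's actual fixtures)
import os

import psycopg2
import pytest


@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    # pytest-docker brings this compose file up before the tests and
    # tears it down afterwards. The path here is hypothetical.
    return os.path.join(str(pytestconfig.rootdir), "tests/integration/docker-compose.yml")


def is_postgres_responsive(host, port):
    # Simple readiness probe: try to open a connection.
    try:
        conn = psycopg2.connect(host=host, port=port, user="postgres",
                                password="postgres", dbname="postgres")
        conn.close()
        return True
    except psycopg2.OperationalError:
        return False


@pytest.fixture(scope="session")
def psql_service(docker_ip, docker_services):
    # "postgres" is an assumed service name in the compose file.
    port = docker_services.port_for("postgres", 5432)
    docker_services.wait_until_responsive(
        timeout=30.0, pause=0.5,
        check=lambda: is_postgres_responsive(docker_ip, port),
    )
    return docker_ip, port
```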
Then run the integration tests:
To run only the multi-driver tests:

```bash
python -m pytest -x -s tests/integration/test_multi_driver.py
```

To run only the step tests:

```bash
python -m pytest -x -s tests/integration/test_step.py
```

This step inserts data from any survey that has been parsed and assigned by the Sorting Hat.
It performs the following operations (see the sketch after this list):
- calculate object statistics
- lightcurve correction
- previous candidates processing
- insert objects
- insert detections
- insert non detections
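As a rough orientation, these operations might hang together as below; the class and method names are illustrative assumptions, not the actual API in `step.py`.

```python
# Hedged sketch of the processing flow; the real implementation lives in
# step.py and its method names may differ.
class IngestionStep:
    def execute(self, messages: list) -> None:
        corrected = self.correct_lightcurves(messages)    # lightcurve correction
        previous = self.process_prv_candidates(messages)  # previous candidates processing
        self.insert_objects(corrected)                    # insert objects
        self.insert_detections(corrected)                 # insert detections
        self.insert_non_detections(previous)              # insert non detections
        self.calculate_object_statistics(corrected)       # calculate object statistics

    # Placeholders standing in for the real logic:
    def correct_lightcurves(self, msgs): ...
    def process_prv_candidates(self, msgs): ...
    def insert_objects(self, msgs): ...
    def insert_detections(self, msgs): ...
    def insert_non_detections(self, msgs): ...
    def calculate_object_statistics(self, msgs): ...
```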
Database interactions:

- Query to get the light curve (detections and non-detections).
- Query if an object exists in the database.

Inserts:

- New detection.
- New non-detection(s).
- Objects.
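For illustration only, these interactions could look roughly like the following with plain `psycopg2`; the table and column names (`object`, `detection`, `non_detection`, `oid`, ...) are assumptions, not the actual schema.

```python
# Illustrative only: table/column names are assumed, not the real schema.
import psycopg2

conn = psycopg2.connect(host="localhost", port=5432, user="user",
                        password="password", dbname="database")
with conn.cursor() as cur:
    # Query if an object exists in the database.
    cur.execute("SELECT 1 FROM object WHERE oid = %s", ("ZTF20abcdef",))
    exists = cur.fetchone() is not None

    # Query to get the light curve (detections and non-detections).
    cur.execute("SELECT * FROM detection WHERE oid = %s", ("ZTF20abcdef",))
    detections = cur.fetchall()
    cur.execute("SELECT * FROM non_detection WHERE oid = %s", ("ZTF20abcdef",))
    non_detections = cur.fetchall()

    # Insert a new detection.
    cur.execute(
        "INSERT INTO detection (oid, mjd, magpsf) VALUES (%s, %s, %s)",
        ("ZTF20abcdef", 59000.5, 18.3),
    )
conn.commit()
```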
There are no special conditions, only a connection to Kafka and the database.
The step is configured through environment variables.

Database connection:

- `DB_HOST`: Database host for the connection.
- `DB_USER`: Database user for read/write (requires these permissions).
- `DB_PASSWORD`: Password of the user.
- `DB_PORT`: Port for the connection.
- `DATABASE`: Name of the database.
Consumer configuration:

- `CONSUMER_TOPICS`: Topics to consume, as a comma-separated string, e.g. `topic_one` or `topic_two,topic_three`.
- `CONSUMER_SERVER`: Kafka host with port, e.g. `localhost:9092`.
- `CONSUMER_GROUP_ID`: Name for the consumer group, e.g. `ingestion-step`.
- `CONSUME_TIMEOUT`: Maximum seconds to wait for a message, e.g. `60`.
- `CONSUME_MESSAGES`: Amount of messages to consume in each operation, e.g. `500`.
- `TOPIC_STRATEGY_FORMAT` (optional): Format for topics that change every day, e.g. `ztf_{}_programid1`.
- `CONSUMER_TOPICS` (optional): Topic list to consume, e.g. `ztf_*`.
You must use at least one of `TOPIC_STRATEGY_FORMAT` or `CONSUMER_TOPICS`.
Step metadata:

- `STEP_VERSION`: Current version of the step, e.g. `1.0.0`.
- `STEP_ID`: Unique identifier for the step, e.g. `S3`.
- `STEP_NAME`: Name of the step, e.g. `S3`.
- `STEP_COMMENTS`: Comments for this specific version.
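A sketch of how these variables might be assembled into the step's configuration; the actual `settings.py` in this repository may structure things differently.

```python
# Hypothetical settings assembly; the repository's settings.py may differ.
import os

DB_CONFIG = {
    "HOST": os.environ["DB_HOST"],
    "USER": os.environ["DB_USER"],
    "PASSWORD": os.environ["DB_PASSWORD"],
    "PORT": int(os.environ["DB_PORT"]),
    "DATABASE": os.environ["DATABASE"],
}

CONSUMER_CONFIG = {
    "PARAMS": {
        "bootstrap.servers": os.environ["CONSUMER_SERVER"],
        "group.id": os.environ["CONSUMER_GROUP_ID"],
    },
    "consume.timeout": int(os.getenv("CONSUME_TIMEOUT", "60")),
    "consume.messages": int(os.getenv("CONSUME_MESSAGES", "500")),
}

# At least one of TOPIC_STRATEGY_FORMAT or CONSUMER_TOPICS must be set.
if os.getenv("TOPIC_STRATEGY_FORMAT"):
    CONSUMER_CONFIG["TOPIC_STRATEGY"] = os.environ["TOPIC_STRATEGY_FORMAT"]
elif os.getenv("CONSUMER_TOPICS"):
    CONSUMER_CONFIG["TOPICS"] = os.environ["CONSUMER_TOPICS"].split(",")
else:
    raise ValueError("Set TOPIC_STRATEGY_FORMAT or CONSUMER_TOPICS")

STEP_METADATA = {
    "STEP_VERSION": os.getenv("STEP_VERSION", "dev"),
    "STEP_ID": os.getenv("STEP_ID", "S3"),
    "STEP_NAME": os.getenv("STEP_NAME", "S3"),
    "STEP_COMMENTS": os.getenv("STEP_COMMENTS", ""),
}
```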
This step requires a consumer.
To use this step, you must first build the Docker image. After that you can run the step.
```bash
docker build -t ingestion_step:version .
```

You can run it with a `docker run` command; you must set all the environment variables:
```bash
docker run --name ingestion_step -e DB_HOST=myhost -e [... all env ...] -d ingestion_step:version
```

You can also edit the environment variables in the `docker-compose.yml` file and then use the `docker-compose up` command. This runs only one container:
```bash
docker-compose up -d
```

If you want to scale this container, set the number of containers to run:
```bash
docker-compose up -d --scale ingestion_step=n
```

Note: use `docker-compose down` to stop all containers.
For each release, an image is uploaded to ghcr.io that you can use instead of building your own. To do so, replace the image in `docker-compose.yml` or in the `docker run` command:
```bash
docker pull ghcr.io/alercebroker/ingestion_step:latest
```

To install the required packages, run:
```bash
pip install -r requirements.txt
```

After that, you can modify the logic of the step in `step.py` and run:
```bash
python scripts/run_step.py
```
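For orientation, such a runner script usually just builds the step with its settings and starts the consume loop. A hedged sketch, reusing the hypothetical settings module from above; the actual module and class names in this repository may differ:

```python
# Hedged sketch of scripts/run_step.py; module and class names are
# assumptions, not necessarily what this repository uses.
from settings import CONSUMER_CONFIG, DB_CONFIG, STEP_METADATA  # hypothetical module
from step import IngestionStep  # assumed entry-point class in step.py

if __name__ == "__main__":
    step = IngestionStep(config={
        "DB_CONFIG": DB_CONFIG,
        "CONSUMER_CONFIG": CONSUMER_CONFIG,
        "STEP_METADATA": STEP_METADATA,
    })
    step.start()  # consume batches from Kafka and call execute() on each
```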