You'll need a Firebase project and an Amazon S3 bucket. Copy .env.example to .env, then fill in their configuration values in .env.
Download the service account key from Firebase and save it as data/serviceAccountKey.json.
yarn start
With docker-compose
# Create the postgres_data folder before running compose so data persists across compose down
# Add -d if you want it to run in background
docker-compose up
# init database
docker-compose run --rm api alembic upgrade heads
# generate test data
# docker-compose run --rm api flask gen testdb
Without docker-compose
Start a venv and install the Python packages (Python 3.6):
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate.bat
pip install -r requirements/base.txt
Database
Update DATABASE_URL to an appropriate value based on the SQLAlchemy database URL format.
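For example, a PostgreSQL entry in .env might look like the following (host, credentials, and database name are placeholders, not this project's actual values):

```
DATABASE_URL=postgresql://user:password@localhost:5432/mydb
```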
Celery
Update the Redis URI for CELERY_BROKER_URL; alternatively, you can use another broker.
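For example, pointing the broker at a local Redis instance (host, port, and database number are placeholders):

```
CELERY_BROKER_URL=redis://localhost:6379/0
```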
Flask
# Werkzeug
flask run
# Gunicorn
gunicorn "api:create_app()"
# uWSGI
uwsgi --ini docker/uwsgi/uwsgi.ini
For uWSGI, change socket to http in uwsgi.ini if you want it to serve HTTP directly rather than sit behind a reverse proxy.
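The change might look like this in docker/uwsgi/uwsgi.ini (a sketch only; the rest of the file is omitted and the port is a placeholder):

```ini
[uwsgi]
# socket speaks the binary uwsgi protocol, for use behind nginx
# socket = :5000
# http serves plain HTTP directly
http = :5000
```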
- Create the postgres_test folder
- Run the test compose in detached mode:
docker-compose -f docker-compose.yml -f docker-compose.test.yml up -d
- Run database migrations:
docker-compose run --rm api alembic upgrade heads
- Initialize the test data:
docker-compose run --rm api flask gen testdb
- Run the tests:
docker-compose run --rm api pytest
Generates a minimal set of records for testing.
Similar to flask gen testdb, but creates a full data set; be aware that it will create a huge number of records in the Images table.
Generates a .md file for each Blueprint; it reads the docstring of every function in the blueprint that has the @route decorator.
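As a rough illustration of the mechanism (not this project's actual code; route, registry, and to_markdown are made-up names), a decorator can record each routed function so its docstring can later be dumped to Markdown:

```python
# Toy sketch: collect every function registered with a @route decorator,
# then emit one Markdown section per function from its docstring.
registry = []  # (rule, function) pairs recorded by the decorator

def route(rule):
    """Record the decorated function under `rule`, like a blueprint route."""
    def decorator(func):
        registry.append((rule, func))
        return func
    return decorator

@route("/users")
def list_users():
    """Return all users."""

def to_markdown():
    """Emit one Markdown section per routed function, from its docstring."""
    sections = []
    for rule, func in registry:
        sections.append(f"## {rule}")
        sections.append(func.__doc__ or "(no docstring)")
    return "\n\n".join(sections)

print(to_markdown())
```

The real generator walks actual blueprints instead of a toy registry, but the idea is the same: the decorator is the hook that makes the docstrings discoverable.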
You can create your own custom scripts and put them under the scripts folder. For scripts that require the application context, import init to push it; if you need to access app, use from init import app.
MIT, unless explicitly stated otherwise.