Demo app for a Drogon C++ backend application.
It demonstrates the following:
- Authentication using JWT with a simple refresh mechanism and an Auth middleware.
- Password hashing with Argon2.
- Database queries with coroutines.
- Pub/Sub system for notifications using ZeroMQ.
- Clustering of geographic coordinates using the PostGIS Postgres extension.
- Media management with an S3-compatible MinIO object store and presigned links.
- Integration tests.
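For the coordinate-clustering feature, PostGIS can group nearby points directly in SQL. As an illustration only (the table and column names below are hypothetical, not this repository's schema), DBSCAN-style clustering looks like this:

```sql
-- Hypothetical table: listings(id bigint, location geometry(Point, 4326))
-- ST_ClusterDBSCAN is a PostGIS window function: points within eps of each
-- other (in the geometry's units) forming groups of at least minpoints become
-- a cluster; outliers get a NULL cluster_id.
SELECT id,
       ST_ClusterDBSCAN(location, eps := 0.001, minpoints := 2) OVER () AS cluster_id
FROM listings;
```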
- Frontend: buyer-frontend
- Backend: buyer-backend (this repository)
Please note that the working directory and relative paths differ depending on the build system; e.g. MSVC uses Debug/Release folders.
Install Postgres on your test machine, either locally or through Docker.
vcpkg can be used to manage library dependencies. Here is a guide to installing it.
Below are some of the dependencies that are used in the project:
- drogon - with some submodules and sub-dependencies like JsonCpp.
- argon2
- jwt-cpp
- cppzmq - Pub/Sub notification system.
- redis-plus-plus - Not currently used, but a code stub is present.
- boost-uuid - Faster UUID generation than Drogon's UUID utility.
- aws-sdk-cpp - Only using S3 component for S3 compatible Minio Service.
- unordered-dense - Extremely efficient and fast hash map.
- Glaze - Super fast JSON parsing and serializing with reflection support. (~2.1X faster than JsonCpp)
Installing using vcpkg:

```shell
# Installing within the project:
# in the project root, run this
vcpkg install

# OR

# Installing globally
# triplet_name here depends on your system e.g. x64-windows-static or x64-windows
# you can omit :triplet_name as the defaults are reasonable for most platforms
./vcpkg install drogon[core,ctl,orm,postgres]:triplet_name --recurse
./vcpkg install jwt-cpp:triplet_name
./vcpkg install argon2[hwopt,tool]:triplet_name
./vcpkg install cppzmq:triplet_name
./vcpkg install redis-plus-plus[async,tls,cxx17]:triplet_name # currently not used, so can be ignored
./vcpkg install boost-uuid:triplet_name
./vcpkg install aws-sdk-cpp[s3]:triplet_name
./vcpkg install unordered-dense:triplet_name
./vcpkg install glaze:triplet_name
```

More information on setting up the drogon-ctl tool can be found in this guide.
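In manifest mode, `vcpkg install` run in the project root reads the project's vcpkg.json. A manifest covering the dependencies above might look like the following sketch (illustrative only; the repository's actual vcpkg.json is authoritative):

```json
{
  "dependencies": [
    { "name": "drogon", "features": ["ctl", "orm", "postgres"] },
    "jwt-cpp",
    { "name": "argon2", "features": ["hwopt", "tool"] },
    "cppzmq",
    "boost-uuid",
    { "name": "aws-sdk-cpp", "default-features": false, "features": ["s3"] },
    "unordered-dense",
    "glaze"
  ]
}
```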
```shell
# Configure CMake
# if you installed vcpkg packages globally, you may need to turn off manifest mode
# and should provide the vcpkg.cmake path
cmake -B ./build -S . "-DCMAKE_TOOLCHAIN_FILE=C:/dev/vcpkg/scripts/buildsystems/vcpkg.cmake" -DVCPKG_MANIFEST_MODE=OFF

# OR using manifest mode (packages are automatically installed at the project level)
cmake -B ./build -S . "-DCMAKE_TOOLCHAIN_FILE=C:/dev/vcpkg/scripts/buildsystems/vcpkg.cmake"

# Build
cmake --build build --config Release --parallel

# clean build
cmake --build build --config Release --parallel --clean-first
```

A docker-compose.yml file is provided for running the Postgres database and media server services, and optionally the main application (uncomment it to enable).
You can bring the containers up and down using the following commands:
```shell
# Compose up
docker compose up -d
# Compose down
docker compose down
# subsequent runs can be started and stopped using
docker compose start
docker compose stop
```

The MinIO object store is used for storing and serving media files. It has an AWS S3-compatible API. It is managed in this project using Docker.
```shell
MINIO_ROOT_USER=minioadmin
MINIO_ROOT_PASSWORD=mypassword
```

If you modify the credentials, make sure to update your created config.json and test_config.json files.
- Modify the configuration files to set up the correct backend runtime environment: create test_config.json and config.json from the provided sample files config-sample.json and test_config-sample.json, and fill in the correct details, especially the database connection details.
- Start the database and media server using docker compose. (You can configure Postgres manually if you prefer.)
- Run the application:
```shell
$ ./buyer-backend.exe --help
Usage: buyer-backend [OPTIONS]
Options:
  --test, -t        Run in test mode using test_config.json.
                    Searches up to 3 parent directories.
  --config <file>   Use the specified config file.
                    Searches up to 3 parent directories.
  --help, -h        Display this help message and exit.
```

```shell
# Run with default config
./buyer-backend
```

By default it runs on port 5555, but this can be configured via the configuration file.
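The listening port comes from the Drogon configuration file. A minimal fragment of config.json might look like the following (illustrative; adapt your copy from the provided config-sample.json):

```json
{
  "listeners": [
    {
      "address": "0.0.0.0",
      "port": 5555,
      "https": false
    }
  ]
}
```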
```shell
# Create Filter
drogon_ctl create filter AuthMiddleware
# Create Controller
drogon_ctl create controller -h api::v1::login
# Create Model - Note that we currently don't use the Drogon ORM
drogon_ctl create model ./models
```

To see the full list of commands provided by drogon_ctl, use drogon_ctl --help and drogon_ctl help create.
Integration tests are written using the Drogon built-in test framework. They require the main application to be running and use the services in docker-compose.test.yml. Docker automatically creates and seeds the DB with the provided migration and seed files.
If you are managing Postgres manually, you can do the following to set up the test DB (this should be done whenever the test configuration changes): create setup_test_db.bat and setup_test_db.sh from the provided sample files setup_test_db-sample.bat and setup_test_db-sample.sh, then fill in the correct database connection details.
```shell
cmake --build build --target configure_tests
# Release build
cmake --build build --config Release --target buyer_backend_test
# Debug build
cmake --build build --config Debug --target buyer_backend_test
```

```shell
# Setup test docker compose
docker compose -f docker-compose.test.yml up -d
# Run main application
./buyer-backend --test
# Method 1: run tests using ctest (slower, but provides more info)
# From ~/buyer-backend/build/test
ctest -C Release
# or
ctest -C Debug
# Note that the -V flag gives verbose output
# This runs a specific ctest
ctest -C Release -R ChatsTest
# Method 2: run the test executable directly (faster)
./buyer_backend_test
# This runs a specific test through the test executable
./buyer_backend_test -r ChatsTest
# compose down (remove any associated volumes)
docker compose -f docker-compose.test.yml down -v
```

- Run the main application in test mode in a separate terminal before running the integration tests, or else the integration tests segfault:

  ```shell
  ./buyer-backend --test # or ./buyer-backend -t
  ```

  The test executable makes requests that are processed by the main application server. Running the main server in test mode ensures that it is connected to the correct test database.
- If managing Postgres manually, ensure that there are no active connections to the PostgreSQL DB before configuring tests; active connections can cause errors in the setup scripts.
```shell
# create the DB
createdb agentbackend
# or with custom parameters
createdb -h %DB_HOST% -p %DB_PORT% -U %DB_USER% %DB_NAME%

# reset the schema
psql -U postgres -d agentbackend -c "DROP SCHEMA public CASCADE; CREATE SCHEMA public;"

# run the migration
psql -U postgres -d agentbackend -f migrations/001_complete_schema.sql
# test DB (or auto-generate using the configure_tests cmake target)
psql -U postgres -d buyer_app_test -f migrations/001_complete_schema.sql
```

Ensure your Postgres installation has PostGIS extension support, as this migration creates the extension.

```shell
# seed the DB
psql -U postgres -d agentbackend -f seeds/001_complete_seed_data.sql
# for test DB (or auto-seed using the configure_tests cmake target)
psql -U postgres -d buyer_app_test -f seeds/001_complete_seed_data.sql
```

If you are managing Postgres manually, you can uncomment the custom targets for setting up the test DB in CMakeLists.txt and test/CMakeLists.txt.
- If you are getting a 500 error instead of a 400 error, then you are not validating the data coming from the request (body and headers).
- If you are getting a 200 or 201 response when you aren't supposed to, then you are not validating your input data. Maybe empty data?
- Note where you add logs: logs added on the test side appear in the test run output, while logs in the application appear in the server logs.
- Initialize Json values as much as possible, especially array responses.
- Using the ["key"] accessor in JsonCpp default-constructs that field if it doesn't exist; prefer .isMember("key") to check for membership.
- If you want to inspect the database after a test run, remove the teardown delete queries, recompile, and rerun the tests. This way you have access to the data after a run for better investigation.
- Use the Drogon LOG_INFO and LOG_ERROR macros for logging. For JsonCpp Json responses this helps:

  ```cpp
  LOG_INFO << json_response.toStyledString();
  // or use .type() to check the value type, e.g.
  if (json_response.type() == Json::ValueType::objectValue) { /* ... */ }
  ```