From day 1 of any Airflow project, you can spin up a local desktop Kubernetes Airflow environment AND one in Google Cloud Composer with tested data pipelines (DAGs) 🖥️ >> [ 🚀, 🚢 ]
Updated Sep 21, 2023 - HCL
An automated data pipeline that retrieves cryptocurrency data from the CoinCap API, processes and transforms it for analysis, and presents key metrics on a near-real-time dashboard
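The retrieve-and-transform step this pipeline describes could look roughly like the sketch below. It assumes the public CoinCap v2 REST API (`/assets` endpoint); which metrics the dashboard actually shows is not stated in the description, so the fields selected in `key_metrics` are illustrative only.

```python
import json
from urllib.request import urlopen

# Public CoinCap v2 endpoint (assumption: the project uses this REST API).
COINCAP_ASSETS_URL = "https://api.coincap.io/v2/assets"

def fetch_assets(limit=10):
    """Retrieve the top `limit` assets, ranked by market cap, as raw records."""
    with urlopen(f"{COINCAP_ASSETS_URL}?limit={limit}") as resp:
        return json.load(resp)["data"]

def key_metrics(assets):
    """Reduce raw asset records to a few dashboard-friendly numeric fields.

    CoinCap returns numeric values as strings, so we cast them here.
    """
    return [
        {
            "symbol": a["symbol"],
            "price_usd": float(a["priceUsd"]),
            "change_24h_pct": float(a["changePercent24Hr"]),
        }
        for a in assets
    ]
```

In a production version of such a pipeline, `fetch_assets` would typically run on a schedule (e.g. via Kestra, per the stack listed below) and land the raw JSON before any transformation.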
A data engineering project with dbt, Docker, Kestra, Terraform, GCP and Looker.
A completely serverless ELT platform that can be used for any integration. We just so happen to focus on threat intelligence right now :)
Deploy an open source and modern data stack in no time with Terraform
Final project for DataTalks.Club Data Engineering bootcamp
The ELT pipeline we’ve developed leverages several Google Cloud Services including Google Cloud Storage (GCS), BigQuery, Pub/Sub, Cloud Workflows, Cloud Run, and Cloud Build. We also use dbt for data transformation and Terraform for infrastructure as code.
Production-grade financial data infrastructure platform on GCP — BigQuery, BigTable, Pub/Sub, GKE, Airflow, dbt, Terraform
Refera Challenge (Part 2): dbt transformations for the data warehouse built on AWS Athena. This project models the raw data from the data lake into a star schema with fact and dimension tables. The transformations are deployed via AWS Lambda and scheduled with CloudWatch. Infrastructure is managed with Terraform.