Demonstrates a RAG (retrieval-augmented generation) data processing pipeline built with Dataflow Flex Templates.
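A RAG ingestion pipeline like this typically splits each document into overlapping chunks before computing embeddings. Below is a minimal, stdlib-only sketch of that chunking step as it might run inside a Beam `DoFn` in such a Flex Template; the `chunk_text` helper, chunk size, and overlap are illustrative assumptions, not taken from the repo.

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split a document into overlapping character chunks for RAG embedding.

    Overlap keeps context that straddles a chunk boundary retrievable.
    (Hypothetical helper; parameter values are illustrative.)
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    step = chunk_size - overlap
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks
```

In a Beam pipeline this would typically be applied per document with something like `beam.FlatMap(chunk_text)` before an embedding step.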
This project focuses on scalable data processing and query performance optimisation. It uses Snowflake for data warehousing, GCP Cloud Functions for serverless compute, and Apache Kafka for real-time data streaming, leveraging their serverless capabilities for scalability and performance.
This project illustrates real-time data processing and analytics, using Apache Kafka to capture and stream data, GCP Cloud Functions to process it in real time, GCP Pub/Sub for notifications, and GCP Looker Studio for data visualization.
GCP Space Shepherd - a service for monitoring Google Cloud Dataflow executions
GitHub Action to create Dataflow templates
Leveraged GitHub Actions to automate the deployment of a GCP pipeline for Snowflake-to-BigQuery data migration. Used 'sensex-data-analysis' as the data source and Snowflake's storage integration feature to load data to GCS. Implemented workflow management and transformation with Composer (Airflow) and Dataflow.
Black Friday, the biggest shopping day of the year, presents a unique opportunity for retailers like Walmart to boost sales, attract new customers, and clear inventory. Managing the surge in transaction volumes, understanding customer preferences, and optimizing inventory in real time are critical challenges that require sophisticated data solutions.
An end-to-end anime recommendation system based on data scraped from myanimelist.net
E-commerce GCP pipelines: a streaming pipeline (Cloud Storage, Compute Engine, Pub/Sub, Dataflow, Apache Beam, BigQuery, Tableau) and a batch pipeline (Cloud Storage, Dataproc, PySpark, Cloud Spanner, Tableau)
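In a streaming pipeline like this, the Dataflow stage usually decodes each Pub/Sub message into a row dict before writing to BigQuery. A minimal, runnable sketch of that transform is below; the `order_id`/`amount` schema is a hypothetical example, not the repo's actual schema. In Beam it would be applied with `beam.Map(pubsub_to_row)` between `ReadFromPubSub` and `WriteToBigQuery`.

```python
import json
from datetime import datetime, timezone

def pubsub_to_row(message_bytes):
    """Decode a Pub/Sub order event (JSON bytes) into a BigQuery-ready row.

    Field names here are illustrative assumptions, not the repo's schema.
    """
    event = json.loads(message_bytes.decode("utf-8"))
    return {
        "order_id": event["order_id"],
        "amount": float(event["amount"]),          # normalize to FLOAT64
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }
```

Keeping the transform a plain function makes it unit-testable without spinning up a pipeline.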
A data pipeline to ingest, process, and store storm-events datasets so they can be accessed in different ways.
Big Data ETL Pipeline for ASL-to-Text (Computer Vision), using Apache Beam on GCP Dataflow
This repo is dedicated to GCP data engineering concepts: Bigtable, BigQuery, Dataflow, Pub/Sub, Dataproc (Spark on GCP), Apache Beam, and Apache Airflow.
Playground for Apache Beam and Scio experiments, driven by real-world use cases.
Sample projects exploring various Google Cloud service offerings and architecture approaches
Apache Beam sandbox w/ Dataflow for 10+ use cases
Boilerplate for orchestrating batch-processing scenarios: Apache Airflow w/ a realistic product-analytics use case
GCP Streaming Data Pipeline for Building Energy Consumption
Trigger a Dataflow job when a file is uploaded to Cloud Storage using a Cloud Function
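This GCS-to-Dataflow trigger pattern boils down to a Cloud Function that receives the Cloud Storage finalize event and launches a Dataflow template with the uploaded file as input. The sketch below shows only the construction of the launch request body; the template path, parameter names, and output table are hypothetical, and the actual launch call (via the Dataflow REST API or client library) is left as a comment.

```python
def build_launch_request(bucket, name, output_table):
    """Build the body for a Dataflow template launch call, from a GCS
    finalize event's bucket/name fields.

    Parameter names (inputFile, outputTable) are illustrative assumptions;
    a real template defines its own parameter names.
    """
    return {
        "jobName": f"process-{name.replace('/', '-')}",
        "parameters": {
            "inputFile": f"gs://{bucket}/{name}",
            "outputTable": output_table,
        },
        "environment": {"tempLocation": f"gs://{bucket}/tmp"},
    }

def on_file_uploaded(event, context):
    """Cloud Function entry point for a google.storage.object.finalize event."""
    body = build_launch_request(
        event["bucket"], event["name"], "my-project:my_dataset.my_table"
    )
    # Launch would happen here, e.g. via the Dataflow REST API:
    # POST .../projects/{project}/locations/{region}/templates:launch
    return body
```

Note that real Dataflow job names are restricted to lowercase letters, digits, and hyphens, so a production version would sanitize the file name more aggressively.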