A small ETL project using Pentaho Data Integration (PDI).
Data Lakehouse course final project (5th semester). This project implements an ETL (Extract, Transform, Load) pipeline using Pentaho Data Integration (Kettle) to build a data warehouse focused on new student admissions data from three sources.
This is a learning project about implementing a scalable ETL pipeline that follows data warehouse (DW) design principles, using Pentaho, MySQL, and MS SQL Server.
This is an ETL system built with Pentaho to transfer multiple data tables between servers in a single execution. During each data transfer, the system can also perform additional tasks, such as generating a file and sending it by email and/or SFTP.
Data warehouse and ETL for bike stores.
This business case was a university project for the Business Intelligence and Visualization module, covering data warehouses, relational databases, and ETL.
Bootcamp taught by IGTI, aimed at presenting the basic and advanced fundamentals of the Business Intelligence development environment and providing hands-on practice with the tools, from data acquisition and Data Warehouse modeling to ETL, analytical reports, and dashboards.
ETL process using Pentaho Data Integration (Kettle) for the Sales and Purchases data marts from AdventureWorks, as the final project of the Data Management course in the Big Data & Analytics Master's @ EAE, class of 2021.