Naveen Kumar
Mobile: +91 9080785060
E-mail: naveenkumar.av371@gmail.com
Experience Summary
7+ years of IT experience as a Snowflake Data Engineer.
Experience in cloud migration: migrated data from Redshift to Snowflake along with the existing Matillion ETL data pipelines.
Built end-to-end ELT pipelines using Python, Snowpipe, and Airflow to automate data ingestion.
Optimized complex SQL queries and Snowflake workloads to reduce cost and improve performance.
Developed modular and reusable dbt models with macros and YAML-based documentation.
Developed incremental models and SCD logic (Types 1 & 2) for historical data tracking in dbt (a plain-SQL sketch of the SCD pattern follows this summary).
Integrated data from REST APIs, S3 files, and relational databases into Snowflake environments.
Maintained version control for dbt projects using Git with branching strategies and code reviews.
Created custom Python utilities for Snowflake DDL automation, schema comparison, and alerting.
Created reusable mapping templates in Informatica to standardize file ingestion workflows.
Troubleshot data pipeline failures using Airflow logs, Snowflake query history, and AWS CloudWatch.
Benchmarked Snowflake vs. Redshift for performance and cost optimization insights.
Collaborated with data analysts to create semantic layers and optimize models for BI/reporting.
Developed ETL pipelines using Snowpipe and Matillion.
Experience in working with Snowflake roles, databases, schemas, and users, and in SQL performance tuning.
Enforced dbt best practices across the team through code reviews and standardized macros.
Implemented alerting and monitoring for pipeline health and data freshness using custom sensors.
Working experience in Agile and the Scaled Agile Framework, using JIRA, Confluence, Jenkins, and Git for process management.
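A minimal, hedged sketch of the SCD Type 2 pattern mentioned above. The table and column names (dim_customer, stg_customer, is_current, valid_from/valid_to) are hypothetical placeholders, and the production logic was generated through dbt incremental models and snapshots rather than hand-written SQL:

    -- Step 1: expire the current row when tracked attributes have changed
    MERGE INTO dim_customer AS tgt
    USING stg_customer AS src
      ON tgt.customer_id = src.customer_id
     AND tgt.is_current = TRUE
    WHEN MATCHED AND (tgt.email <> src.email OR tgt.city <> src.city) THEN
      UPDATE SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP();

    -- Step 2: insert a fresh current row for new and changed customers
    INSERT INTO dim_customer
      (customer_id, email, city, valid_from, valid_to, is_current)
    SELECT s.customer_id, s.email, s.city, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL;

Type 1 attributes would simply be updated in place rather than versioned with new rows.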
Professional Experience
Working as a Snowflake Architect in Tech Mahindra since June 2022.
Education
Completed PG from JNT University, Anantapur, in 2013.
Technical Skill Set
ETL Tools: dbt, AWS Glue, AWS EMR, Informatica
Cloud Services: Snowflake, AWS (S3, EC2, Redshift, EMR, Kinesis)
RDBMS: PostgreSQL, SQL Server, MySQL
Reporting Tools: Streamlit
Scheduling Tools: Airflow
Operating Systems: Windows
Languages: Python, SQL
Project Details:
Current Project:
Project: United Health Care
Role: Data Engineer
Duration: May 2020 to Present
Description: The project focused on building a cloud-native, scalable, and secure data platform for United
HealthCare to consolidate patient, provider, claims, and billing data into a single source of truth using
Snowflake on AWS. The goal was to modernize legacy systems, improve data availability, ensure regulatory
compliance (HIPAA), and support real-time analytics and reporting for clinical and operational decisions.
Responsibilities:
Created external stages to load data from various sources such as S3 and on-premises systems.
Responsible for all activities related to the development, implementation, administration, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
Imported and exported data via external stages (S3 buckets).
Wrote complex SQL scripts in the Snowflake cloud data warehouse for business analysis and reporting.
Collaborated with cross-functional teams to troubleshoot data integration issues and implement scalable solutions.
Configured Snowpipe for continuous data ingestion (see the sketch after this list).
Performed troubleshooting, analysis, and resolution of critical issues.
Involved in performance tuning of sessions, mappings, and PL/SQL procedures.
Involved in data analysis and handling of ad-hoc requests by interacting with business analysts and clients, and resolved issues as part of production support.
Loading data into Snowflake tables from internal/external stages.
Used COPY, LIST, PUT and GET commands for validating internal and external stage files.
Involved in data cleansing to maintain data quality and improve performance.
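A minimal sketch of the external-stage and Snowpipe setup described in the bullets above, assuming a pre-existing storage integration; the object names (s3_int, claims_stage, raw.claims, claims_pipe) and bucket path are illustrative placeholders, not the actual project objects:

    -- External stage over the S3 landing bucket
    CREATE STAGE IF NOT EXISTS claims_stage
      URL = 's3://example-landing/claims/'      -- hypothetical bucket
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- One-off bulk load from the stage
    COPY INTO raw.claims
      FROM @claims_stage
      ON_ERROR = 'CONTINUE';                    -- skip bad rows, keep loading

    -- Snowpipe for continuous ingestion, triggered by S3 event notifications
    CREATE PIPE IF NOT EXISTS claims_pipe
      AUTO_INGEST = TRUE
      AS COPY INTO raw.claims FROM @claims_stage;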
Project #2:
Project: Swiggy India Pvt Ltd
Role: ETL/Snowflake Developer
Duration: Aug 2017 to April 2020
Description: The Swiggylytics product empowered Swiggy's engineering, analytics, and product teams to access relevant metrics whenever they need them and make quick, data-backed decisions to improve the user experience and build a better Swiggy for everyone. The project ingested data from multiple sources and stored it in structured tables within Snowflake's cloud-based data warehouse.
Responsibilities:
Migrated Oracle databases from on-premises to Snowflake.
Worked across multiple functional projects to understand data usage and implications for data migration.
Migrated Oracle PL/SQL objects to Snowflake.
A logical and critical thinker, able to get up to speed quickly on complex domain knowledge.
Loaded data into Snowflake tables from internal stages and from the local machine.
Used COPY, LIST, PUT, and GET commands for validating internal and external stage files (see the sketch after this list).
Used import and export with both internal stages (Snowflake) and external stages (S3 buckets).
Performed all SDLC phases to complete ETL development work (requirement gathering, analysis, design, unit testing, and deployment).
Conducted performance tuning and optimization of data processing workflows to enhance
efficiency and scalability.
Worked closely with data analysts and business stakeholders to understand requirements and
deliver data-driven insights.
Participated in the development of data governance frameworks and best practices to
maintain data quality and integrity.
Responsible for task distribution among the team; performed troubleshooting, analysis, and resolution of critical issues.
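A short illustrative sketch of the internal-stage load-and-validate flow (PUT, LIST, COPY, GET) referenced above; the orders table and file paths are hypothetical examples, and PUT/GET must be run from a client such as SnowSQL rather than the web UI:

    -- Upload local files to the table's internal stage
    PUT file:///tmp/orders_*.csv @%orders AUTO_COMPRESS = TRUE;

    -- List the staged files to confirm names and sizes
    LIST @%orders;

    -- Dry run: report parse errors without loading any rows
    COPY INTO orders FROM @%orders
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
      VALIDATION_MODE = RETURN_ERRORS;

    -- Load for real once validation is clean
    COPY INTO orders FROM @%orders
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- Pull a staged file back to the local machine for inspection
    GET @%orders file:///tmp/verify/;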