Sampath Kumar
Professional Summary:
• ETL Developer with 8+ years of experience in Travel, Financial, Telecommunication and Retail
domains.
• Extensive experience in implementing Microsoft BI/Azure BI solutions using Azure Data Factory,
Azure Synapse, Power BI, SQL Server Integration Services (SSIS), and SQL Server Reporting
Services (SSRS).
• Worked on ETL and data integration, developing ETL mappings and scripts, guiding the team
through transformations, and covering all aspects of the SDLC, including requirements gathering,
analysis, design, and development.
• Hands-on experience with various SQL Server services, including Integration Services (SSIS),
Analysis Services (SSAS), and Reporting Services (SSRS).
• Experience with Azure transformation projects, implementing ETL data movement solutions using
Azure Data Factory and SSIS.
• Experience in recreating existing application and functionality logic in Azure Data Factory, Azure SQL
Database, Azure Data Lake, and SQL Data Warehouse environments.
• Experience in building data pipelines using Azure Data Factory and loading data into Azure Data
Lake and Azure SQL Database.
• Knowledge of migrating data from on-prem SQL Server to cloud databases such as Azure Synapse
Analytics (DW) and Azure SQL DB.
• Experience in creating pipeline jobs, scheduling triggers, and mapping data flows using Azure Data
Factory (ADF), and using Key Vault to store credentials.
• Experience in logging defects in Azure DevOps and Jira.
• Strong PL/SQL programming skills for data population and table alterations, with Oracle
performance tuning, SQL tuning, and query optimization experience.
• Proficient in writing Transact-SQL queries that perform both DDL and DML operations, including
creating tables, views, indexes, filegroups, and partitions, performing inserts, updates, and deletes
using joins and subqueries, and adding constraints to tables.
• Experience in using SSIS tools such as the Import and Export Wizard, Package Installation, and SSIS
Package Designer.
• Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL
Database, Databricks, and Azure SQL Data Warehouse; controlling and granting database access;
and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
• Knowledge of scripting languages such as Python and PySpark in Azure Function Apps and Azure
Synapse notebooks.
• Experience with data integration tools such as Azure Data Factory and similar Microsoft data
integration tools, as well as Azure SQL, data lake, and data warehouse solutions.
• Hands-on experience with Azure Data Factory (ADF) data pipelines, including extracting secure files
via an SFTP gateway.
Technical Skills/Additional Information:
• SDLC Methodologies: Waterfall, AGILE, Rational Unified Process (RUP), Use Cases, JAD/RAD Sessions,
Requisite Pro, DOORS, Rational Rose, Borland CaliberRM
• Data Modeling Tools: ER/Studio 8.5, Erwin 3.5/4.1.4/7.5.3, Power Designer 12.1, Rational Rose, and
Visio (UML)
• RDBMS: Teradata 13/12/V2R6/V2R5, Oracle (11g/10g/9i/8i/7.x), SQL Server
(2008R2/2008/2005/2000/7.0), DB2, MySQL, Access, SSRS, SSIS, MS Dynamics AX
• Data Warehousing: Informatica PowerCenter 8.1/8.0/7.1/7.0/6.2/6.1/5.2, Informatica PowerMart
4.7, PowerConnect, PowerExchange, Data Profiling, Data Cleansing, OLAP, OLTP, SQL*Plus, T-SQL
• Quality Assurance: Business and Software Process and Procedure Definition, Quality Models and
Quality Tools (Pareto analysis, histograms), Process Measures and Metrics, Project Reviews, Audits
and Assessments.
• Azure Tools: Azure Data Factory (ADF), Azure Databricks, Azure Data Lake, Azure DevOps, Azure
SQL, Azure Synapse.
Professional Experience:
W.B. Mason, Brockton, MA March 2021- Present
ETL Developer
Responsibilities:
• Responsible for designing SSIS packages and Azure Data Factory pipelines for the data extraction,
transformation, loading from the multiple data source systems to the target system.
• Built triggers using Logic Apps to export and import files between SFTP/FTP and Data Lake storage.
• Used change tracking to upsert data from an on-prem SQL database to Azure SQL Database,
loading only delta records.
• Created pipelines in ADF using linked services and datasets to extract, transform, and load data
from sources such as Azure SQL, Blob Storage, and Azure SQL Data Warehouse.
• Created an Azure Function App and wrote a Python function using both HTTP and Azure Blob
Storage triggers (a minimal sketch appears after this list).
• Created a Synapse notebook using PySpark to upload a file from Data Lake to an FTP server (a
sketch follows the environment listing for this role).
• Point of contact for all AX-related queries and issues; communicated ongoing issues and statuses
to the team lead and responded to internal support queries, such as data integrity and validation
questions.
• Worked on creating pipeline jobs, scheduling triggers, mapping dataflows using Azure Data Factory
(ADF) and using key vaults to store credentials.
• Used SSIS and Azure Data Factory to extract files via an SFTP gateway.
• Worked with the team to elicit, document, and finalize business requirements such as process
models, data, business rules, and functions.
• Created source and target table definitions using SSIS; source data was extracted from flat files
and SQL Server.
• Developed Power BI reports and dashboards from multiple data sources.
• Responsible for migrating the existing SSIS packages into Azure Data Factory.
• Used stored procedure, lookup, data flow, conditional statements, copy data, execute pipeline
functions in ADF.
• Loaded data from an on-prem SQL database to SharePoint using SSIS.
• Worked on migrating data from Data Lake storage to Azure Synapse Analytics (DW).
• Used third-party SSIS tools such as KingswaySoft and Task Factory to connect to CRM via the
CRM/CDS connection manager.
• Created stored procedures, views, indexes, and functions in SQL Server.
• Created external tables in an Azure Synapse serverless SQL pool.
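Illustrative sketch for the Azure Function bullet above: a minimal Python (v2 programming model) function app with an HTTP trigger and a Blob Storage trigger. The route, container path, and names are hypothetical placeholders, not the production code.

import logging
import azure.functions as func

app = func.FunctionApp()

# HTTP-triggered entry point (route and auth level are placeholders).
@app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
def http_ingest(req: func.HttpRequest) -> func.HttpResponse:
    file_name = req.params.get("file", "unknown")
    return func.HttpResponse(f"Ingest requested for {file_name}", status_code=200)

# Blob-triggered entry point; fires when a file lands in the container.
@app.blob_trigger(arg_name="blob", path="landing/{name}",
                  connection="AzureWebJobsStorage")
def on_blob_upload(blob: func.InputStream):
    logging.info("New blob %s (%d bytes)", blob.name, blob.length)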
Environment: SQL Server 2017, SQL Server Integration Services (SSIS), Azure Analytics Services: Azure
Data Factory, Azure Data Lake, Azure DevOps, Visual Studio 2019, Azure Databricks, Python, PySpark,
Azure Synapse, XrmToolBox, KingswaySoft, Task Factory.
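A minimal sketch of the Synapse-to-FTP upload mentioned in this role: read a file from Data Lake with Spark's binaryFile source, then push it to an FTP server with Python's ftplib. The ADLS path, FTP host, and account names are hypothetical; in practice the credentials would come from Key Vault, as noted above.

import io
import ftplib

# `spark` is the session pre-created in a Synapse notebook.
# Hypothetical ADLS Gen2 path; binaryFile returns the file body in `content`.
src = "abfss://outbound@mydatalake.dfs.core.windows.net/reports/report.csv"
row = spark.read.format("binaryFile").load(src).select("content").first()

# Hypothetical FTP host; credentials would be fetched from Azure Key Vault.
with ftplib.FTP("ftp.example.com") as ftp:
    ftp.login(user="svc_etl", passwd="<from-key-vault>")
    ftp.storbinary("STOR report.csv", io.BytesIO(bytes(row["content"])))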
FCM Travel Solutions, Marlborough, MA November 2017-March 2021
ETL Developer
Responsibilities:
• Worked with the offshore Microsoft Dynamics AX (D365) team and the development team regularly
to communicate issues with the source data.
• Worked with the third-party vendors, offshore and onshore teams to resolve any client escalated
queries or production issues.
• Responsible for replacing the existing AX system with the new system using ETL.
• Researched and validated sources for data discrepancies.
• Worked with the team to elicit, document, and finalize business requirements such as process
models, data, business rules, and functions.
• Used SSIS/SSRS for data extraction, transformation, and loading from multiple data source
systems to the target system.
• Designed and developed various SSIS packages (ETL) to extract and transform data and involved in
Scheduling SSIS Packages.
• Developed and Optimized Stored Procedures and Functions using T-SQL.
• Created and scheduled SQL Agent jobs and maintenance plans.
• Converted Data Transformation Services (DTS) packages to SQL Server Integration Services (SSIS)
as assigned.
• Designed SSRS Reports using T-SQL queries based on business requirement.
• Filtered, purged, and cleansed dirty data from legacy system using complex T-SQL statements in
staging area, and implemented various constraints and triggers for data consistency.
• Extensively used Calculations, Variables, Breaks, Sorting, and Alerts for creating Business Objects
reports.
• Used Azure Data Factory (ADF) to create pipelines that extract data from source systems and load
it into targets.
• Used built-in ADF activities such as Stored Procedure, Copy, Delete, and conditional activities.
• Used ADF to extract secure files via an SFTP gateway.
• Created Spark clusters and configured high-concurrency clusters in Azure Databricks to speed up
the preparation of high-quality data.
• Transformed data by running a Python activity in Azure Databricks (a sketch follows below).
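A hedged sketch of the kind of PySpark transform such a Databricks Python activity might run; the mount points, schema, and column names are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On a Databricks cluster the session already exists; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Hypothetical landing-zone mount and schema.
raw = spark.read.option("header", True).csv("/mnt/landing/bookings/")
clean = (raw.dropDuplicates(["booking_id"])
            .withColumn("booking_date", F.to_date("booking_date", "yyyy-MM-dd"))
            .filter(F.col("amount").cast("double") > 0))

# Write curated output for downstream reporting.
clean.write.mode("overwrite").parquet("/mnt/curated/bookings/")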
Environment: SQL Server 2017, SQL Server Integration Services (SSIS), SQL Server Reporting Services
(SSRS), Jira, Azure Analytics Services: Azure Data Factory, Azure Data Lake, Azure DevOps, Visual
Studio 2017, Azure Databricks
Frontier Communications, Richardson, TX September 2016 to October 2017
ETL Developer
Responsibilities:
• Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and
Mart.
• Developed ETL mapping and PL/SQL code as per the business requirements.
• Created mappings using pushdown optimization to achieve good performance in loading data into
Netezza.
• Deployed, Tested and Scheduled SSIS packages using SQL Agent.
• Managed the project in breaking down the existing EDW to manageable data marts.
• Wrote scheduler scripts in UNIX to run the developed PL/SQL scripts. Loaded data from flat files
into tables using SQL*Loader.
• Worked with Type 1 and Type 2 dimensions, fact tables, star schema design, Operational Data
Store (ODS) leveling, and other data warehouse concepts.
• Responsible for designing and optimizing various T-SQL database objects like tables, views, stored
procedures, functions, indexes, and triggers using Management Studio and SQL Profiler.
• Created and maintained Informatica users and privileges.
• Applied formal data model descriptions using data modeling techniques.
• Migration of Informatica Mappings/Sessions/Workflows from Dev, QA to Prod environments.
• Created Groups, roles, privileges and assigned them to each user group.
• Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master
Data Management Architecture involving OLTP, ODS and OLAP.
• Worked on load balancing to perform operations such as heterogeneous data access and data
integrity checks using Teradata Parallel Transporter (TPT).
• Wrote SQL queries against the repository DB to find deviations from the company's ETL standards
in user-created objects such as sources, targets, transformations, log files, mappings, sessions, and
workflows.
• Built and supported database models to support ETL, reports, dashboards, and BI packages.
• Troubleshot and debugged issues that arose during project execution, providing root cause
analysis (RCA).
Environment: Informatica 9.1, Teradata 14, PowerExchange (CDC), PL/SQL, Informatica Developer
(IDQ) 8.6.1, Netezza, Apache Spark, IBM WebSphere, Oracle 10g, SAP, Cognos, T-SQL, BO BI 4.0, HDFS,
SSIS, UNIX-AIX, HP ALM 10, HP SM, Erwin 4.0, MS Visio.
MassMutual, Springfield, MA September 2015 to August 2016
ETL Developer
Responsibilities:
• Involved in the Design, Development, Testing phases of Data warehouse.
• Coordinated with the application team to develop ETL processes.
• Provided an efficient interface with the brand for all processes across various platforms.
• Developed and tested Stored Procedures, Functions, and packages in PL/SQL.
• Ensured participation of appropriate resources in the scoping, planning and completion of all agile
projects.
• Created shared and distributed catalogs for users, and created calculations, conditions, and
prompts in catalogs using Oracle procedures and packages.
• Wrote functional documentation (use cases, interfaces, functionalities, data models).
• Validated functional documentation with the client.
• Scheduled, monitored, coded, and tested custom PL/SQL modules to load data into the data
warehouse from legacy systems.
• Designed DataStage jobs using QualityStage stages in 8.0 for data cleansing and data
standardization.
• Implemented Survive and Match stages for data patterns and data definitions.
• Administered DB2 subsystems in Production, QA, Test, and Development regions.
• Worked on Informatica debugging using expression evaluator.
• Created and debugged large T-SQL database scripts and batches to facilitate effective database
management.
• Created source and target definitions in Informatica PowerCenter Designer.
• Conducted database testing to check constraints, field sizes, indexes, stored procedures, etc.
• Involved in writing stored procedures and shell scripts for automating the execution of jobs.
• Evaluated all functional requirements, mapped documents, and performed troubleshooting on all
development processes.
• Designed all test cases to provide support to all systems and performed unit tests.
Environment: Informatica 8.6, Java, IBM WebSphere, Oracle 9i, T-SQL, SQL, PL/SQL, SSIS, Teradata
SQL, CDC, Windows XP, MS Access.
Education:
Master’s in Information Systems from the University of Mary Hardin-Baylor