
Mohit Kumar

Mobile: +91 9654254949
~ E-Mail: rakheja.mk@gmail.com
~ Blog: msbicoder.blogspot.com

Senior Data Engineer | Azure


Over 12 years of experience in Azure (Data Factory, Databricks, Data Lake, Key Vault, Logic Apps, Synapse, Cosmos DB, Functions) and Google BigQuery

 Experience in migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
 Experience in developing Spark applications using Spark SQL in Databricks for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
 Extract, transform and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics); ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and process it in Azure Databricks.

PROFILE SUMMARY

A dynamic professional with over 12 years of experience in:

Data Factory | MSBI (SSIS, SSRS, SSAS) | Databricks | Azure Data Lake | Data Warehousing

 Analyze, design and build modern data solutions using Azure PaaS services to support visualization of data. Understand the current production state of applications and determine the impact of new implementations on existing business processes.
 Extract, transform and load data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL and U-SQL (Azure Data Lake Analytics); ingest data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and process it in Azure Databricks.
 Created pipelines in ADF using Linked Services/Datasets/Pipelines to extract, transform and load data between different sources and sinks such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, including write-back in the reverse direction.
 Developed Spark applications using PySpark and Spark SQL for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns (a minimal sketch follows this list).
 Designed highly scalable and fault tolerant, highly available and secured, distributed infrastructure (IAAS) using EC2
instances, EBS, S3, RDS, ELB, Auto Scaling, Lambda, Redshift, DynamoDB etc. in AWS Cloud.
 Significant experience of working on different platforms such as Oracle 11g (SQL & PL/SQL) and SQL Server 2008/2008 R2/2012
 An effective communicator with excellent planning and analytical skills
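
Below is a minimal PySpark sketch of this multi-format extraction, transformation and aggregation pattern; the file paths, column names (customer_id, event_type, event_ts) and the daily-usage aggregation are illustrative assumptions, not details from any specific engagement.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usage-aggregation").getOrCreate()

# Extract from multiple file formats (paths and schemas are placeholders)
csv_events = spark.read.option("header", True).csv("/mnt/raw/events_csv/")
json_events = spark.read.json("/mnt/raw/events_json/")
parquet_events = spark.read.parquet("/mnt/raw/events_parquet/")

# Transform: align the sources to a common shape before combining them
common_cols = ["customer_id", "event_type", "event_ts"]
events = (csv_events.select(common_cols)
          .unionByName(json_events.select(common_cols))
          .unionByName(parquet_events.select(common_cols)))

# Aggregate: daily event counts per customer to surface usage patterns
usage = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("customer_id", "event_date")
         .agg(F.count("*").alias("events_per_day")))

usage.write.mode("overwrite").parquet("/mnt/curated/customer_usage/")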

CORE COMPETENCIES

 Hands-on experience in Azure Cloud Services (PaaS & IaaS), Azure Synapse Analytics, SQL Azure, Data Factory, Azure Analysis Services, Application Insights, Azure Monitoring, Key Vault and Azure Data Lake.
 Very Good Experience in BI Development (SSRS, SSIS, SSAS)
 Knowledge of Excel macros.
 Fluent in SQL Coding using different software tools such as SQL Developer, TOAD and MS SQL Server Management Studio
2008/2008 R2/2012
 Well versed in using Data Services Designer to ETL data from multiple sources, including flat files, Excel workbooks, XML, SAP ECC and third-party tools, through Data Store setup and custom transfer programs
 Well versed in using Performance Tuning techniques to improve Throughput of data
 Fluent in utilizing Data Integrator and Platform transforms, Variables and functions
 Performed job validation, syntax checks and dry runs to evaluate errors, execution time and bottlenecks; utilized the debugging function, adding breakpoints, try/catch blocks, target-table Auto-Correct Load and auto-recovery mechanisms
 Loaded test data to estimate job run time and performance statistics, then optimized and fine-tuned the job; used push-down where applicable.

IT SKILLS

 Conversant with:
o Languages: Spark SQL/Python, UNIX Shell Script, C#, VB.Net, T-SQL
o Database: Oracle11G (SQL & PL/SQL) and SQL Server 2008/2008R2/2012/2014/2016
o Cloud Infrastructure: Azure Data Factory, Databricks, Data Lake, Azure SQL Database, Azure SQL Data Warehouse, Azure Key Vault, Azure Logic Apps, Azure Synapse, Azure Cosmos DB
o IDE & BI Tools: SQL Server BI Development Studio (BIDS), SAP BODS and MS Visual Studio 2015/2010/2008
o Scripting: JavaScript, jQuery, Python, Spark SQL, PySpark
o ETL: MSBI, SSIS, SSAS, SAP BODS 4.2
o Reporting: Tableau, PowerBI, SSRS, Crystal Reports
o Database Tools: TOAD, SQL Developer and SSMS (SQL Server Management Studio)
o OS: Linux (RedHat) and Windows Server 2012/2008 R2

ORGANISATIONAL EXPERIENCE

2nd Nov’20 till date Assurant Automotive Warranty Solutions (India) Private Limited
23rd Mar’20 to 16th Sep’20 MasterCard India Services Pvt. Ltd.
15th May’19 to 21st Mar’20 RBS (NatWest Markets) as Senior Software Designer
8th April’15 to 15th Mar’19 Fiserv India Pvt. Ltd as Data Architect Analyst
26th Sep’11 to 20th Mar’15 Future Soft India Pvt. Ltd., New Delhi as Developer

Key Result Areas:


 Profiling various information systems and capturing source/target mappings and transformations based on project requirements
 Created rich dashboards using Tableau and prepared user stories to create compelling dashboards that deliver actionable insights.
 Experience with DI transforms such as Query, Validation and Case, as well as DQ transforms such as Match, Associate, Data Cleanse & Address Cleanse
 Responsible for interaction with business stakeholders, gathering requirements and managing delivery.
 Connected to Tableau Server to publish dashboards to a central location for portal integration.
 Connected Tableau Server with the SharePoint portal and set up the auto-refresh feature.
 Experience in debugging execution errors using Data Services logs (trace, statistics, and error)
 Tuning jobs for high volume and complex transformation scenarios
 Configuring DS components including job server and repositories
 Responsible for PL/SQL, SQL and T-SQL queries, procedures, triggers, cursors and functions
 Designing SSIS packages and SSRS reports for the IVR application
 Offering assistance in ongoing project development & execution
 Managing a huge database pertaining to the IVR system for CDRs & call events
 WSS 3.0: installation, configuration, registration of shared web parts with the domain name, linking SSRS reports with the configured WSS portal for client interaction, creation of user accounts, permissions for the WSS portal & troubleshooting
 MS SQL Server 2008 R2/2012: installation, configuration, creation of linked servers (Oracle 11/CacheDB/PostgreSQL).
 SSIS: installation, package configuration, development & deployment.
 SSRS: installation, Reporting Services configuration, RDL development & deployment.
 Provide support to the design as well as development team in ongoing projects.
 Developing T-SQL queries, triggers, functions, cursors and stored procedures.
 Created views to restrict access to data in a table for security (a minimal sketch follows this list).
 Created ETL packages using heterogeneous data sources (SQL Server, flat files, Excel source files, etc.) and loaded the data into destination tables by performing different kinds of transformations using SSIS
 Designed and optimized all the SSIS packages.
 Identified and defined the datasets for report generation and included report parameters
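
A minimal sketch of the restricted-view idea mentioned above, assuming a SQL Server database reached from Python via pyodbc; the server, table, column, view and role names are all placeholders, not the project's real schema.

import pyodbc

# Connection details are placeholders, not the real environment
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=IVR;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# A view exposing only non-sensitive columns of a call-event table,
# so downstream users never query the base table directly
cursor.execute("""
    CREATE VIEW dbo.vw_call_events_public AS
    SELECT call_id, call_start_time, call_duration_sec, circle
    FROM dbo.call_events;
""")
conn.commit()

# Grant read access on the view only (role name is a placeholder)
cursor.execute("GRANT SELECT ON dbo.vw_call_events_public TO reporting_role;")
conn.commit()
conn.close()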

CERTIFICATION

 DataBricks Certified Data Engineer Associate (Version 3)


 Microsoft Certified Azure Data Engineer Associate (DP-203)
 Microsoft Certified Azure Data Fundamentals (DP-900)
 Microsoft Certified Azure Fundamentals (AZ-900)
 DataBricks Lakehouse Platform Essentials
 DataBricks SQL Analyst Associate
 Python - PCEP 30-02
 MCTS Certified in SQL Server 2008

TRAINING

 Completed training in .Net 3.5 Framework and Avaya Process

EDUCATION

2014 - 2016 MTech (Specialization in Data Analytics) from BITS, Pilani


2006 - 2010 BE (Computer Science Engineering) from Vaish College of Engineering, Rohtak, MDU with 75%

PERSONAL DETAILS

Date of Birth: 23rd Aug, 1988


Address: T5-701, Emaar Gurgaon Greens, Sector 102, Gurgaon - 122006
Languages Known: English, Hindi and Punjabi

Please refer annexure for project details


ANNEXURE

Major Projects Handled:

Title: POCKET GEEK / CONNECTED LIVING


Period: Nov’20 till date
Client: Assurant Auto
Team Size: 8 members
Platform/OS: Microsoft Azure Platform
Languages: Azure Data Factory, Databricks, SSIS, Python, PowerBI, SQL Server 2012, Windows Server 2012, PySpark, Spark SQL, Kafka
Details: The purpose of this project is to build a metadata-driven framework to extract Google Analytics data from BigQuery (Firebase environment) into Azure SQL Managed Instance. The project is divided into three parts: copying Pocket Geek Auto data from the Google Cloud BigQuery platform in parquet format for each day into an Azure Data Lake (ADL) storage account; reading the parquet files from the ADL storage account and loading them into Delta Lake tables; and finally reading the data from the Delta Lake tables, storing it in parquet format on ADLS and loading it into Azure SQL Managed Instance.
Role:

 Created pipelines in ADF using Linked Services/Datasets/Pipelines to extract, transform and load data between different sources and sinks such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, including write-back in the reverse direction.
 Developed Spark applications using Spark and Spark SQL for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
 Fetched streaming data from Kafka using PySpark and stored it in Delta tables (a minimal sketch follows these bullets).
 Developing BI reports based on a wide range of data sets using BI development tools such as SSIS, SSAS, SSRS and Tableau to ensure reports deliver accurate and meaningful information to our stakeholders.
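
A minimal sketch, assuming Databricks (Spark 3.1+) with the Kafka connector available, of streaming events from Kafka into a Delta table with PySpark; the broker address, topic, event schema, checkpoint path and target table name are illustrative assumptions.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-to-delta").getOrCreate()

# Hypothetical schema for the incoming device events
event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("event_name", StringType()),
    StructField("event_ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
       .option("subscribe", "pocket_geek_events")           # hypothetical topic
       .option("startingOffsets", "latest")
       .load())

# Kafka delivers the payload as bytes; parse the JSON value into columns
events = (raw
          .select(from_json(col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Append the parsed events to a Delta table (checkpoint path and table are hypothetical)
(events.writeStream
 .format("delta")
 .outputMode("append")
 .option("checkpointLocation", "/mnt/adls/checkpoints/pocket_geek_events")
 .toTable("bronze.pocket_geek_events"))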

Title: MERCHANT ANALYTICS / ISSUER ANALYTICS


Period: Mar’20 to Sep’20
Client: Bank across Asia Pacific & US Region
Team Size: 8 members
Platform/OS: Windows Server 2012 r2
Languages: Azure Data Factory, Databricks, SSIS, Python, SQL Server 2012, Windows Server 2012, Tableau, PowerBI, Netezza, Hadoop (Hive, Impala)
Details: Merchant Analytics is a data-driven analytical solution to increase the effectiveness, competency, accuracy and utility of data across different sectors such as FSBIs, online retail and FMCG.
Role:

 Created pipelines in ADF using Linked Services/Datasets/Pipelines to extract, transform and load data between different sources and sinks such as Azure SQL, Blob Storage and Azure SQL Data Warehouse, including write-back in the reverse direction.
 Developed Spark applications using Spark and Spark SQL for data extraction, transformation and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
 Developing BI reports based on a wide range of data sets using BI development tools such as SSIS, SSAS, SSRS and Tableau to ensure reports deliver accurate and meaningful information to our stakeholders.
 Created over 10 reports and dashboards using Salesforce for weekly, monthly, quarterly and annual analysis.

Title: FORD
Period: May’19 – Mar’20
Client: NatWest Markets Financial Partners
Team Size: 10 members
Platform/OS: Windows Server 2012 r2
Languages: Azure Data Factory, Databricks, Oracle 11g, MS SQL Server 2008/2012, SSIS, SSAS, SSRS, Tableau, .NET, Python, Geneos, Autosys, SVN, JIRA, Confluence
Details: FORD is an enterprise data warehouse analytical reporting solution that gives our business users the ability to determine how trading occurs in investment banking products. It delivers different types of reports (Escalation Report, Foreign Exchange Dashboard, Money Market Dashboard, and Security/Derivatives Reports).
Role:

 Designed and configured Azure Cloud relational servers and databases, analyzing current and future business requirements.
 Worked on migration of data from On-prem SQL server to Cloud databases (Azure Synapse Analytics (DW) & Azure
SQL DB).
 Set up separate application and reporting data tiers across servers using geo-replication functionality.
 Implemented Disaster Recovery and Failover servers in Cloud by replicating data across regions.
 Created pipeline jobs, scheduled triggers and built Mapping Data Flows using Azure Data Factory (V2), using Key Vault to store credentials.
 Creating dashboards in Microsoft Reporting Services and PowerBI based on user requirements.
 Working with Cubes and writing relational and multidimensional database queries.
 Data cleaning and data analysis using Python libraries such as pandas and numpy (a minimal sketch follows these bullets).
 Monitoring automated server tasks and server performance using Geneos.
 Creating automated tasks/jobs in AutoSys.
 Ensure all our code is in line with the team’s best practice/code standards.
 Collaborate with stakeholders both in Operations and Technology.
 Working in scrum and agile methodology, handling end-to-end release processes
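
A minimal pandas/numpy sketch of the kind of data cleaning and analysis mentioned above; the input file and column names (trade_date, notional, product) are hypothetical, not the project's real schema.

import pandas as pd
import numpy as np

# Hypothetical trade extract; the path and columns are placeholders
trades = pd.read_csv("trades_extract.csv")

# Basic cleaning: normalise column names, drop exact duplicates,
# coerce types and fill missing numeric values
trades.columns = trades.columns.str.strip().str.lower()
trades = trades.drop_duplicates()
trades["trade_date"] = pd.to_datetime(trades["trade_date"], errors="coerce")
trades["notional"] = pd.to_numeric(trades["notional"], errors="coerce").fillna(0.0)

# Flag unusually large trades for the escalation report (threshold is illustrative)
trades["large_trade"] = np.where(trades["notional"] > 1e7, 1, 0)

# Simple analysis: daily notional by product, e.g. to feed a dashboard
summary = (trades
           .groupby(["trade_date", "product"], as_index=False)["notional"]
           .sum()
           .sort_values("trade_date"))
print(summary.head())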

Title: FCRM (Financial Crime Risk Management)


Period: Jan’18 to Mar’19
Client: Bank of America, Goldman Sachs
Team Size: 10 members
Platform/OS: Windows Server 2012 r2
Languages: Oracle 11g, SSIS, SSRS, SQL Server 2012/2014, Tableau 8.2, SAP Business Objects Data Services 4.2, Python, Data Modeling
Details: Financial Crime Risk Management is an anti-money-laundering product of Fiserv used by several financial institutions to detect financial fraud. The aim is to analyze the business problem and configure the AML scenarios, RBDD rules, threshold values and monetary-value checks, followed by client data profiling, architectural design, data cleaning and data analysis, alert and case generation, watchlist filtering and reporting.
Role:
 Client data profiling using SSIS for different customers & transactions.
 SSIS Package & Dashboard Development.
 Interaction with Client to understand the Business Requirement.
 Creation of metrics, attributes, filters, reports and dashboards; built advanced chart types, visualizations and complex calculations to manipulate the data.
 Developing unit test cases and mentoring the peer team members on code changes

Title: ARCHITECT (Retail Banking), EPAW, SERVICENOW, CLARITY, OSI


Period: Apr’15 to Mar’20
Client: Bank of America, Goldman Sachs
Team Size: 12 members
Platform/OS: Windows Server 2012 r2
Languages: Oracle 11g, SSIS, SSRS, SQL Server 2012/2014, Tableau 8.2, SAP Business Objects Data Services 4.2, Python
Details: Build a Warehouse for Fiserv Product (Card Services, Bank Solutions, Item Processing)
Role:
 Creation of metrics, attributes, filters, reports and dashboards; built advanced chart types, visualizations and complex calculations to manipulate the data.
 Developing unit test cases and mentoring the peer team members on code changes
 Worked on Python scripting for mail notifications in SAP BODS (see the sketch below).
 Worked on macros for converting files from .doc to .docx and .csv to .docx.
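
A minimal sketch of a Python mail-notification script of the kind a BODS job might call; the SMTP host, addresses and job details are placeholders, and a real environment might also require authentication or TLS.

import smtplib
from email.message import EmailMessage

def send_job_notification(job_name: str, status: str, log_path: str) -> None:
    """Send a simple completion/failure mail for a BODS job run."""
    msg = EmailMessage()
    msg["Subject"] = f"BODS job {job_name}: {status}"
    msg["From"] = "etl-alerts@example.com"      # placeholder sender
    msg["To"] = "data-team@example.com"         # placeholder distribution list
    msg.set_content(f"Job {job_name} finished with status {status}.\nLog: {log_path}")

    # SMTP host/port are placeholders for the environment's mail relay
    with smtplib.SMTP("smtp.example.com", 25) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    send_job_notification("JOB_CUSTOMER_LOAD", "SUCCESS", "/logs/job_customer_load.log")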

Title: Airtel IVR and Reporting System


Period: Since Jul’12
Client: Bharti Airtel India & Africa (Kenya, Uganda, Zambia, Tanzania, Nigeria - Abuja, Lekki & Benin, Madagascar, Malawi, Ghana, Congos/DR, Gabon, SL, Seychelles and Niger)
Team Size: 10 members
Platform/OS: Windows Server 2008 r2, Linux
Languages: Oracle 11g, SQL Server 2008 R2, Python, SSIS, SSRS, Tableau, WSS Services, SharePoint Admin
Details: This project involved fetching CDR & call-event files from 4 different hubs and loading them into our centralized database through SSIS packages, generating the SSRS reports (IVR analytics, CSAM, CAPA and billing reports) needed on a daily basis, and publishing them on a WSS (Windows SharePoint Services) portal on Windows Server for the internal Bharti team.
Role:
 Accountable for the development of several Oracle database objects such as tables, packages, procedures and functions for data extraction
 Carrying out Oracle performance tuning of several database objects/packages
 Developing unit test cases and mentoring the peer team members on code changes
 Performing code reviews of team members
 Preparing stored procedures & triggers, SSIS packages and SSRS reports

Title: VAS (Value Added Services) – Bharti Airtel India


Period: Since Feb’14
Client: AVAYA India Pvt. Ltd.
Team Size: 10 members
Platform/OS: Windows Server 2008 r2, Linux
Languages: Oracle 11g, SQL Server 2008r2, SSIS, SSRS, WSS Services and SharePoint Admin.

Title: Self Service Management Portal (Bharti Airtel India)


Period: Since Sep’13
Client: AVAYA India Pvt. Ltd.
Team Size: 10 members
Platform/OS: Windows Server 2008r2/ Windows 7/IIS7
Languages/Tools: Oracle 11G (SQL, PL/SQL), SQL Server 2008r2, SSIS, SSRS
Details: This project involved a self-service management application, an intranet website for the internal team, covering different modules (FCP, OICK, DRP-Teasor, CSR-Teasor, ARPU, IMEI and CTI)

Other Projects:
 Stock Accounting & Reporting System (Nestle India Pvt. Ltd)
 National Electronic Clearing Service (Bank of America)
