
Venkatasuresh Veginati

Mobile: 9000403491 | Email: veginatisuresh1998@gmail.com

PROFESSIONAL SUMMARY:

• Knowledge of Azure Data Lake, Azure Data Factory, Azure Databricks, Azure SQL Server and Database, SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.
• Experience managing a Big Data platform deployed in the Azure cloud.
• Knowledge of moving OLTP and OLAP data into Data Lake using Azure Data Factory and Databricks.
• Experience managing and storing confidential credentials in Azure Key Vault.
• Good experience writing Python code for data loads and extracts in Azure Databricks.
• Implemented multiple activities and custom pipelines in Azure Data Factory for on-cloud ETL processing.
• Experience integrating Hive queries into the Spark environment using Spark SQL (see the PySpark sketch after this list).
• Hands-on experience with ETL transformation tools such as Talend to extract data from external repositories and internal sources.
• Profound knowledge of cloud platforms such as Azure services.
• Capable of picking up new technologies with a minimal learning curve.
• Hands-on experience in Python programming and Spark components such as Spark Core and Spark SQL.
• Worked on creating RDDs and DataFrames for the required input data and performed data transformations using Spark Core.
• Hands-on experience with Apache Hadoop components such as HDFS, MapReduce, Hive, and Sqoop, and in-depth knowledge of Oozie.
• Good analytical, problem-solving, and communication skills.
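
A minimal PySpark sketch of the Spark SQL and DataFrame/RDD work described above. The database, table, and column names (sales_db.orders, region, amount) are hypothetical placeholders, not taken from the resume.

# Run a Hive query through Spark SQL, then transform the result.
# Table/column names here are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("hive-to-spark-sql")
         .enableHiveSupport()   # lets Spark SQL query existing Hive tables
         .getOrCreate())

# Integrate an existing Hive query into the Spark environment.
orders = spark.sql("SELECT region, amount FROM sales_db.orders")

# DataFrame transformation: total amount per region.
totals = (orders.groupBy("region")
                .agg(F.sum("amount").alias("total_amount")))

# Equivalent RDD-level transformation using Spark Core.
totals_rdd = (orders.rdd
              .map(lambda row: (row["region"], row["amount"]))
              .reduceByKey(lambda a, b: a + b))

totals.show()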

ACADEMIC PROFILE:

• Completed B.Sc. in Computer Science, 2019.

TECHNICAL SKILLS:

Big Data Technologies : HDFS, MapReduce, Hive, Spark
Coding Languages : Python, SQL
Database : MySQL
Operating Systems : Windows

Project 1:

Project Title : Data Platform Migration to Azure
Environment : Azure Databricks, PySpark, Azure Data Factory, Python

DESCRIPTION:
This project focuses on migrating an on-premises Enterprise Data Warehouse to the cloud using Microsoft Azure
technologies. It includes rewriting the whole solution with Azure technologies, migrating data from various
disparate legacy source systems, interacting with the business to gather and finalize requirements, and finally
integrating and testing the whole solution.
RESPONSIBILITIES:
• Primarily involved in data migration using SQL, SQL Azure, Azure Data Lake, and Azure Data Factory.
• Responsible for extracting data from OLTP and OLAP systems into Data Lake using Azure Data Factory and Databricks.
• Used Azure Databricks notebooks to extract data from Data Lake and load it into Azure and on-prem SQL databases (a sketch follows this list).
• Developed pipelines that extract data from various sources and merge it into single-source datasets in Data Lake using Databricks.
• Encrypted business-sensitive data in Data Lake using a cipher algorithm (see the encryption sketch below).
• Decrypted sensitive data with keys to produce refined datasets for analytics, providing end users access.
• Created connections from different on-prem and cloud sources into a single source for Power BI reports.
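
A minimal PySpark sketch of the Databricks extract-and-load step above. The storage account, paths, JDBC URL, table, and credentials are hypothetical placeholders; in a real notebook the secret would come from Azure Key Vault via a Databricks secret scope.

# Extract from Azure Data Lake, refine, and load into Azure SQL Database.
# All paths, URLs, and credentials below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("datalake-to-sql").getOrCreate()

# Extract: read a curated dataset from an ADLS Gen2 path.
df = spark.read.parquet("abfss://curated@mydatalake.dfs.core.windows.net/sales/orders")

# Transform: keep only the columns the downstream consumers need.
refined = df.select("order_id", "region", "amount")

# Load: write the refined dataset into an Azure SQL Database table over JDBC.
(refined.write
    .format("jdbc")
    .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=salesdb")
    .option("dbtable", "dbo.orders_refined")
    .option("user", "etl_user")       # in practice, fetch from Key Vault, e.g.
    .option("password", "<secret>")   # dbutils.secrets.get("kv-scope", "sql-password")
    .mode("overwrite")
    .save())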
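
A sketch of the encrypt/decrypt step for sensitive values, assuming the widely used Python cryptography package (Fernet symmetric encryption); the actual cipher algorithm used on the project is not specified in the resume, and the key would be kept in Azure Key Vault rather than generated inline.

# Symmetric encryption/decryption of a sensitive value.
# Fernet is an assumption; the resume does not name the cipher used.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # placeholder; fetch from Azure Key Vault in practice
cipher = Fernet(key)

token = cipher.encrypt(b"4111-1111-1111-1111")   # store the encrypted token in Data Lake
print(cipher.decrypt(token))                     # key holders can recover the value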
