Ranganath Jasti                                                           Mobile: 510-579-3986
Email: jasti.ranganath@gmail.com
Professional Experience: Dynamic and motivated IT professional with around 15 years of experience as a Senior Engineer, Technology, with expertise in designing data-intensive applications using AWS, Snowflake, Google Cloud Platform, Data Warehouse/Data Mart, Data Visualization, Reporting, and Data Quality solutions across business domains such as Investment, Insurance, and Telecom.
Expert in Extract, Transform, and Load (ETL) data flows using SSIS; created workflows to extract data from SQL Server and flat-file sources and load it into various business entities.
Extensive experience in gathering user requirements and translating them into technical and system specifications; managed client, vendor, and partner relationships.
Extensive experience with business information systems, focusing on database architecture, data modeling, data analysis, programming, and application integration.
Experience in creating tables, views, stored procedures, and indexes; troubleshooting database issues; and performance tuning of stored procedures and databases.
Automated data loading from Snowflake's internal stage into Snowflake tables by creating Python client applications that call the Snowpipe REST endpoints with the pipe names (a brief sketch of this pattern follows this summary).
Created AutoSys jobs to run the Python applications responsible for loading the data automatically on a scheduled basis.
Good experience in migrating data solutions from other databases to Snowflake, GCP BigQuery, and AWS Redshift using various AWS services.
Experience in developing data engineering routines using the Snowpark API with Python.
Experience with Snowflake multi-cluster warehouses and in-depth knowledge of Snowflake database, schema, and table structures.
Expertise in Snowflake data modeling, ELT using Snowflake SQL, and implementing complex stored procedures and standard DWH and ETL concepts using the Snowpark API with Python.
Experience in building and architecting multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation using GCP Cloud Storage, BigQuery, Dataflow, Composer, and Dataproc.
Created tables, views, complex stored procedures, and functions in BigQuery for consumption by client applications responsible for BI reporting and analytics.
Leveraged Python and Apache Beam, executed on Cloud Dataflow, to run data validation between raw source files and BigQuery tables.
Used S3 storage for summarized business data and leveraged Athena for SQL queries and analysis.
Developed data ingestion modules (both real-time and batch data loads) to load data into various layers in S3 and Redshift using AWS Glue, AWS Lambda, and AWS Step Functions.
Created S3 buckets for data storage in the AWS cloud and managed bucket policies and lifecycle rules per the organization's guidelines.
Created PySpark Glue jobs to implement data transformation logic in AWS and stored the output in a Redshift cluster.
      Experience in creating Data Governance Policies, Data Dictionary, Reference Data,
       Metadata, Data Lineage, and Data Quality Rules.
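The Snowpipe automation summarized above relies on a small Python client that calls the Snowpipe REST endpoint by pipe name for files already staged with PUT. A minimal sketch of that pattern using the snowflake-ingest SDK is shown below; the account, user, key file, pipe, and stage paths are hypothetical placeholders, and the key is assumed to be unencrypted.

```python
# Minimal sketch (assumed names throughout): notify Snowpipe, by pipe name,
# that files already PUT into an internal stage are ready to load.
from cryptography.hazmat.primitives import serialization
from snowflake.ingest import SimpleIngestManager, StagedFile

with open("rsa_key.p8", "rb") as key_file:
    private_key = serialization.load_pem_private_key(key_file.read(), password=None)
key_pem = private_key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption(),
).decode()

ingest_manager = SimpleIngestManager(
    account="myaccount",
    host="myaccount.snowflakecomputing.com",
    user="LOAD_USER",
    pipe="ANALYTICS.RAW.ORDERS_PIPE",        # fully qualified pipe name (placeholder)
    private_key=key_pem,
)

# Relative stage paths of files previously uploaded with PUT.
resp = ingest_manager.ingest_files([StagedFile("orders/2024-01-15.csv", None)])
print(resp["responseCode"])                  # 'SUCCESS' when the request is accepted
```

In practice a wrapper like this is what an AutoSys job would invoke on a schedule, as described in the bullets above.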
Educational Qualification:
Bachelor of Science (Computer Science), 2003, Andhra University.
One-year Post Diploma in Software Engineering, 2004, NTTF IT Centre, Chennai.
Software Profile:
Operating Systems               Windows 10, 11
Cloud Related Technologies      Snowflake (SnowSQL, Snowpipe, Snowpark API, REST endpoints API); AWS (S3, Lambda, Glue, EMR, Athena, Redshift); GCP (Cloud Storage, BigQuery, Cloud Dataflow); PySpark, Spark, Hive
BI and Scheduler Tools          SSDT, SSIS, SSRS, SSAS, Power BI, DAX, MDX, and AutoSys
Databases                       SQL Server 2008/2012/2016, Oracle 11g
API Integration Tools           MuleSoft Anypoint Studio, RAML, CloudHub, Runtime Manager
Programming Languages           VB.NET, C#.NET, ASP.NET, Web Services, and Python
Internet Technologies           HTTP, REST, XML, and JSON
Repository Tools                Microsoft TFS 2013, Maven, SVN, GitHub
Virtuous Tech Inc – Virginia.                                                        02/2023 – Till Date
Role: Data Architect
Roles & Responsibilities:
Responsible for understanding the existing legacy reporting platform, documenting and translating the requirements into system solutions, and developing the migration plan to the cloud platform as per the schedule.
Migrated data solutions from other databases to Snowflake and AWS Redshift using various AWS services.
Created Python scripts that call the Snowpipe REST endpoints with the pipe names for continuous data loads from the internal stage into Snowflake tables.
Created AutoSys jobs to run the Python applications responsible for loading the data automatically on a scheduled basis.
Implemented incremental loads inside Snowflake using Tasks, Streams, Pipes, and stored procedures.
Created complex ETL Snowflake stored procedures and functions using the Snowpark API with Python (a brief Snowpark sketch follows this list).
Created complex reporting and analytical Snowflake stored procedures and functions using the Snowpark API with Python for consumption by Mule APIs.
Built and architected multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation using GCP Cloud Storage, BigQuery, Dataflow, Composer, and Dataproc.
Designed and architected the various layers of the data lake and designed the star schema in BigQuery.
Processed and loaded data from Google Pub/Sub topics into BigQuery using Cloud Dataflow with Python.
Created tables, views, complex stored procedures, and functions in BigQuery for consumption by client applications responsible for BI reporting and analytics.
Developed data ingestion modules (both real-time and batch data loads) to load data into various layers in S3 and Redshift using AWS Glue, AWS Lambda, and AWS Step Functions.
Created S3 buckets for data storage in the AWS cloud and managed bucket policies and lifecycle rules per the organization's guidelines.
Created PySpark Glue jobs to implement data transformation logic in AWS and stored the output in a Redshift cluster.
Used S3 storage for summarized business data and leveraged Athena for SQL queries and analysis.
Validated the data from SQL Server to Snowflake to ensure an apples-to-apples match.
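The Snowpark-based stored procedures mentioned above wrap ETL logic in Python and register it inside Snowflake. The sketch below is a minimal illustration under assumed names (RAW.ORDERS_STAGE, DW.ORDERS, LOAD_ORDERS, and the connection details are all placeholders), not the actual production procedures.

```python
# Minimal sketch (assumed table, stage, and procedure names): register a
# Snowpark Python stored procedure that deduplicates staged rows and appends
# them to a reporting table, then invoke it.
from snowflake.snowpark import Session
from snowflake.snowpark.types import StringType

def load_orders(session: Session) -> str:
    staged = session.table("RAW.ORDERS_STAGE").drop_duplicates("ORDER_ID")
    staged.write.save_as_table("DW.ORDERS", mode="append")
    return "load complete"

connection_parameters = {   # placeholder credentials; replace with real values
    "account": "myaccount", "user": "ETL_USER", "password": "***",
    "warehouse": "ETL_WH", "database": "ANALYTICS", "schema": "DW",
}
session = Session.builder.configs(connection_parameters).create()

# Register the function as a permanent stored procedure so it can be run with
# CALL LOAD_ORDERS() from SQL, a Snowflake task, or an AutoSys-triggered script.
session.sproc.register(
    load_orders,
    name="LOAD_ORDERS",
    return_type=StringType(),
    input_types=[],
    packages=["snowflake-snowpark-python"],
    is_permanent=True,
    stage_location="@DW.PROC_STAGE",
    replace=True,
)
print(session.call("LOAD_ORDERS"))
```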
Environment: AWS S3, Lambda, Glue, EMR, Athena, Redshift, PySpark, SQL Server 2016, SSIS, Snowflake, Python, Snowpark API, REST endpoints API, AutoSys scheduler, GCP, BigQuery, Cloud Dataflow, Dataproc, Cloud SQL, Pub/Sub, Composer.
Invesco – Atlanta.                                                     01/2018-01/2023
Role: Data Architect
GCCP (Global Client Communication Platform) is a client reporting application used to generate various kinds of reports, such as client books and factsheets, for Invesco's global institutional clients located across APAC, EMEA, and the USA. The system allows the Reporting, Marketing, and Compliance teams to review and approve documents and finally publish them to external clients and the Invesco website.
Roles & Responsibilities:
Responsible for understanding the existing legacy reporting platform, documenting and translating the requirements into system solutions, and developing the migration plan to the cloud platform as per the schedule.
Migrated data solutions from other databases to Snowflake and AWS Redshift using various AWS services.
Created Python scripts that call the Snowpipe REST endpoints with the pipe names for continuous data loads from the internal stage into Snowflake tables.
Created AutoSys jobs to run the Python applications responsible for loading the data automatically on a scheduled basis.
Implemented incremental loads inside Snowflake using Tasks, Streams, Pipes, and stored procedures.
Created complex ETL Snowflake stored procedures and functions using the Snowpark API with Python.
Created complex reporting and analytical Snowflake stored procedures and functions using the Snowpark API with Python for consumption by Mule APIs.
Built and architected multiple data pipelines and end-to-end ETL and ELT processes for data ingestion and transformation using GCP Cloud Storage, BigQuery, Dataflow, Composer, and Dataproc.
Designed and architected the various layers of the data lake and designed the star schema in BigQuery.
Processed and loaded data from Google Pub/Sub topics into BigQuery using Cloud Dataflow with Python (a minimal pipeline sketch follows this list).
Created tables, views, complex stored procedures, and functions in BigQuery for consumption by client applications responsible for BI reporting and analytics.
Developed data ingestion modules (both real-time and batch data loads) to load data into various layers in S3 and Redshift using AWS Glue, AWS Lambda, and AWS Step Functions.
Created S3 buckets for data storage in the AWS cloud and managed bucket policies and lifecycle rules per the organization's guidelines.
Created PySpark Glue jobs to implement data transformation logic in AWS and stored the output in a Redshift cluster.
Used S3 storage for summarized business data and leveraged Athena for SQL queries and analysis.
Validated the data from SQL Server to Snowflake to ensure an apples-to-apples match.
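The Pub/Sub-to-BigQuery loading described above can be expressed as a streaming Apache Beam pipeline executed on Cloud Dataflow. The sketch below is a minimal illustration; the project, topic, bucket, table, and schema names are hypothetical placeholders rather than the actual GCCP pipeline.

```python
# Minimal sketch (assumed project, topic, bucket, and table names): a streaming
# Apache Beam pipeline, runnable on Cloud Dataflow, that reads JSON messages
# from a Pub/Sub topic and appends them to a BigQuery table.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    streaming=True,
    runner="DataflowRunner",              # "DirectRunner" for local testing
    project="my-gcp-project",
    region="us-east1",
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-gcp-project/topics/orders")
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-gcp-project:reporting.orders",
            schema="order_id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```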
Environment: AWS S3, Lambda, Glue, EMR, Athena, Redshift, PySpark, SQL Server 2016, SSIS, Snowflake, Python, Snowpark API, REST endpoints API, AutoSys scheduler, GCP, BigQuery, Cloud Dataflow, Dataproc, Cloud SQL, Pub/Sub, Composer.
Invesco – Atlanta.                                                   07/2012-12/2017
Role: Sr Engineer, Technology
Roles & Responsibilities:
Liaised with EMEA and APAC business subject-matter experts to analyze business requirements and translate them into detailed conceptual, process, logical, and physical data models and schema development in the database.
Recommended appropriate reporting solutions to end users based on need and assisted business units with prioritization of outstanding DW development projects.
Architected the report data mart; reviewed and maintained the schema, its tables, indexes, views, and functions in SQL Server.
Gathered requirements and designed and maintained ETL processes covering data quality, testing, information delivery, and access to the data warehouse; coordinated quality assurance and system testing, assisted with systems implementations, and evaluated the results.
Interacted with vendors (technical issues, project initiatives) independently as necessary and acted as the point person for issue escalation.
Responsible for application upgrades and implementation: identified new functionality and/or hardware requirements, created test plans, reviewed and validated functionality, reported problems, and created and managed cutover plans including downtime.
Wrote and analyzed complex reports using DeskNet and Microsoft Power BI and made modifications to complex reports; tuned and troubleshot complex SQL queries to decrease execution runtimes for online reporting.
Designed SSIS packages for ETL loads and generated Excel extracts on a scheduled basis.
Created reports in Power BI utilizing SSAS Tabular models for the Invesco Marketing/Presentations team.
Involved in analyzing, designing, building, and testing OLAP cubes with SSAS and in adding calculations using MDX.
Extensively involved in SSAS storage, partitions, and aggregations, and in developing calculations and reports using MDX and SQL.
Extensive use of stored procedures: wrote new stored procedures, T-SQL, and UDFs, and modified and tuned existing ones so that they perform well.
Assisted with documentation and code deployments across Development, QA, and Production environments.
Environment: SQL Server 2012, SSIS, SSAS, XML, TFS, MS .NET Web Services, DeskNet Content Welder 8.2, Power BI, MDX, DAX.
CSSI – Hyderabad, Ind.                                                      05/2009-06/2012
Role: Sr Software Engineer
VUE Compensation Management is a powerful, flexible, and intuitive tool that makes it easy for insurance organizations to organize and streamline complex commission and incentive programs. The system creates the reports and information necessary to manage the business and build strategic business knowledge. Agent Management, Agent License, Agent Appointment, Agent Contract, Carrier, Product, Insured, Plan, Financial, Reports, Tracking and Controlling, Admin, and Document Management are the different components present in VUE.
Roles & Responsibilities:
Worked with business/functional analysts to determine ETL requirements.
Involved in the ETL process and developed 40+ medium-to-complex SSIS packages.
    Migrated data from Flat files, CSV files, XML and Excel spreadsheets to SQL Server
       databases using Integration Services (SSIS).
Extensive use of stored procedures: wrote new stored procedures, T-SQL, and UDFs; addressed and fixed performance issues associated with stored procedures; and created proper indexes to support queries.
Created package configurations, deployed Integration Services projects, and scheduled SSIS packages via jobs.
Created and maintained reports in SSRS, writing various types of reports with SQL Server Reporting Services such as drill-down, parameterized, cascading, table, matrix, chart, and sub-reports.
Assisted with documentation and code deployments across Development, QA, and Production environments.
Environment: SQL Server 2005/2008, SSIS, SSRS, Flat files, XML, CSV, VBScript.
HSBC – Hyderabad, Ind.                                                           06/2008-04/2009
Role: Software Engineer
The Broker Unit Database is used to record data related to the brokers who bring business to the company in the form of customers, who are referred to as Introductions. The main purpose and scope of the Broker Unit Database system is to record the details of brokers, professionals, and Introductions, and to generate different types of reports as required by the business.
Roles & Responsibilities:
Involved in the design and development of all modules of the application.
       Designed and coded the presentation layer using ASP.NET Web Forms, JavaScript, and
         C#.Net.
       Used ADO.NET for database interactions.
       Extensively wrote stored procedures for database manipulations.
Environment: ASP.NET 2.0, ADO.NET 2.0, SQL Server 2005, IIS 5.0, C#.NET, JavaScript.
Aegis BPO Services Ltd – Hyderabad, Ind.                                    02/2007-05/2008
Role: Software Programmer
Verizon Wireless owns and operates the nation's most reliable wireless network. Headquartered in Bedminster, NJ, Verizon Wireless is a joint venture of Verizon Communications (NYSE: VZ) and Vodafone (NYSE and LSE: VOD), and a Dow 30 company that is a leader in delivering broadband and other communication innovations to wireline and wireless customers.
Major Responsibilities:
Involved in the ETL process and developed medium-to-complex SSIS packages.
      Migrated data from Flat files, Excel spreadsheets to SQL Server databases using
       Integration Services (SSIS).
Extensive use of stored procedures: wrote new stored procedures, T-SQL, and UDFs, and modified and tuned existing ones so that they perform well.
Created package configurations, deployed Integration Services projects, and scheduled SSIS packages via jobs.
Environment: SQL Server 2005, SSIS, Excel, VBScript.