
Prashant Sharma

Email: prashantcapriconian@gmail.com
Phone: 612-799-9085

Experience Summary

 11 years of ETL architecture and development experience, including developing ETL solutions using tools such as
Informatica PowerCenter 9.x/10.x, Informatica Data Replication 9.5, the Informatica Data Quality suite, SSIS,
and Teradata tools and utilities.
 Creating metadata reports in Data Analyzer to run when a PowerCenter session completes.
 Well versed in Teradata tools and utilities such as BTEQ, FastLoad, MultiLoad, and TPT to automate batch processes
and achieve better data loading performance.
 Sound understanding of EDW architecture and components.
 Experience in Cognos BI tools using Cognos Framework Manager, Report studio, Query studio, Analysis studio,
PowerPlay Transformer, Impromptu Web Reports (IWR).
 Worked in medium to large data warehouse environments and well versed with basic EDW concepts like star
schema and dimensional modeling.
 Well-defined analytical skills; extensive experience finding the reasons for gaps in data result sets using complex
SQL.
 Worked in domains including insurance (Manulife and Aviva), retail, and telecom.
 Performance Tuning of Informatica ETL jobs and SQL queries to reduce overall batch time.
 Well versed with production deployment and migration, performance/capacity testing and QA support.
 Profound understanding of Data Warehouse, ETL, and business intelligence best practices (Informatica best
practices).
 Project management tools such as JIRA, SourceTree, and PVS Kintana.
 Strong experience in performance tuning and debugging of Informatica mappings.
 Ability to develop and analyze complex SQL and PL/SQL scripts.
 Extensive use of UNIX shell scripting and basic commands.
 Good understanding of agile methodology for project implementation.
 Analysis, design, development, and implementation of data warehouse, ETL, client/server, and mainframe
transactional applications.
 An effective communicator with strong interpersonal, analytical & problem-solving skills.
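The gap analysis mentioned above can be sketched in a few lines. This is only an illustrative Python sketch with hypothetical table and column names; on the actual projects this comparison was done in SQL against the warehouse (e.g., an outer join filtered on missing keys).

```python
# Hypothetical sketch: find rows present in one result set but missing
# from the other, keyed on "id". Real projects expressed this in SQL.
def find_gaps(source_rows, target_rows, key="id"):
    """Return (missing_in_target, missing_in_source) row lists."""
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    missing_in_target = [r for r in source_rows if r[key] not in target_keys]
    missing_in_source = [r for r in target_rows if r[key] not in source_keys]
    return missing_in_target, missing_in_source

# Sample data (made up for illustration).
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 3, "amt": 30}, {"id": 4, "amt": 40}]

missing_in_target, missing_in_source = find_gaps(source, target)
print(missing_in_target)  # source rows that never reached the target
print(missing_in_source)  # target rows with no matching source row
```

Each side of the result points at a different class of load defect: rows dropped by the ETL versus rows loaded with no surviving source counterpart.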

Technical Experience Summary

ETL and Reporting Tools: Informatica PowerCenter 9.x/10.x, Informatica Cloud Services, Informatica Data
Replication v9.5, Informatica Data Quality 9.5.1, DataStage, Pentaho, and SSIS
Operating Systems: IBM Mainframe, UNIX (Sun Solaris, AIX, HP-UX, Linux)
Versioning Tools: JIRA, SourceTree, HEAT, PVS Kintana
Databases/EDI Tools: Teradata v13 and v14, Oracle 11g, Microsoft SQL Server
Teradata Utilities: SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, TPT, SQL Developer
Source Systems: relational tables, XML/XSD files, COBOL files, and CSV flat files
Languages: Advanced SQL, PL/SQL, UNIX shell scripts
Professional Experience

MoneyGram International (Financial Services) March 2018 – Present


Senior ETL Consultant (through Apex Systems)

Responsibilities:

 Senior ETL developer for data integration projects at MoneyGram.


 ETL design and development of initial load, incremental, and Type 2 mappings using Informatica
PowerCenter 9.5 and 10.x.
 Loading Oracle tables using simple to complex Informatica ETL mappings.
 Experienced in defining, designing, integrating, and re-engineering enterprise data warehouses and data
marts in environments such as Teradata and Oracle, multiple terabytes in size and of varying levels of
complexity.
 Teradata performance optimization using Explain Plan and other DB tuning methods.
 Optimizing long running queries in Teradata v13 using SQL Assistant.
 Using BTEQ scripts in BT/ET sessions to perform pre- and post-database operations, such as dropping and
recreating indexes on target tables.
 Assisting QA while testing applications using IDQ data console.
 Creating migration documents for ETL code developed.
 Scheduling jobs using ESP scheduling tool and fixing Production issues.
 Data quality control using the Data Director and Analyst tools.
 Designed import ETL jobs using the Human Task component, which gives users real-time control to fix reference
data on the go.
 Following project standards like release and change management.
 Maintaining documentation such as mapping specification and ETL design documents.
 Distributing time among multiple projects by contributing as a technical advisor for ETL design and scope
evaluation.
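The incremental mappings above split incoming source rows into inserts and updates against the current target. A minimal Python sketch of that delta logic, with hypothetical column names (the actual implementation used Informatica PowerCenter mappings, not Python):

```python
# Hypothetical sketch of incremental-load delta detection: new keys are
# routed to INSERT, changed rows to UPDATE, unchanged rows are dropped.
def delta(source, target, key="id"):
    """Split source rows into (inserts, updates) relative to target."""
    current = {row[key]: row for row in target}
    inserts = [r for r in source if r[key] not in current]
    updates = [r for r in source
               if r[key] in current and r != current[r[key]]]
    return inserts, updates

# Sample data (made up for illustration).
source = [{"id": 1, "name": "A"}, {"id": 2, "name": "B2"}, {"id": 3, "name": "C"}]
target = [{"id": 1, "name": "A"}, {"id": 2, "name": "B"}]

inserts, updates = delta(source, target)
print(inserts)  # [{'id': 3, 'name': 'C'}]  -> routed to INSERT
print(updates)  # [{'id': 2, 'name': 'B2'}] -> routed to UPDATE
```

In a PowerCenter mapping the same split is typically done with a Lookup on the target followed by a Router and Update Strategy transformation.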

Aviva (Insurance) November 2016 – February 2018


Senior ETL Consultant (through Veritaaq)

Responsibilities:

 Worked on real-time systems for the Reference Data Management project as part of the RBC integration.
 ETL development using Informatica Data Quality and the Informatica Analyzer tool.
 Overseeing design and development of initial load, incremental, and Type 2 mappings using
Informatica PowerCenter 9.5 and 10.x.
 Loading Teradata target tables using Informatica TPT connections and BTEQ Scripts.
 Experienced in defining, designing, integrating, and re-engineering enterprise data warehouses and data
marts in environments such as Teradata and Oracle, multiple terabytes in size and of varying levels of
complexity.
 Expertise working with Teradata V2R12.x systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ,
TPump, and Teradata SQL.
 Teradata performance optimization using Explain Plan and other DB tuning methods.
 Optimizing long running queries in Teradata v13 using SQL Assistant.
 Using BTEQ scripts in BT/ET sessions to perform pre- and post-database operations, such as dropping and
recreating indexes on target tables.
 Assisting QA while testing applications using IDQ data console.
 Creating migration documents for ETL code developed.
 Scheduling jobs using ASG Zena and fixing production issues.
 Data quality control using the Data Director and Analyst tools.
 Designed import ETL jobs using the Human Task component, which gives users real-time control to fix reference
data on the go.
 Following project standards like release and change management.
 Maintaining documentation such as mapping specification and ETL design documents.
 Distributing time among multiple projects by contributing as a technical advisor for ETL design and scope
evaluation.

Technical Environment:
Informatica Data Quality, SSIS, PowerCenter 10.1.1 (HotFix 1), Teradata Tools and Utilities v13, Cognos Impromptu
Web Reporting (IWR) Studio.

Manulife Financial (Insurance) August 2015 – October 2016


Senior ETL Consultant (TEKsystems)

Responsibilities:

 The Valuation Systems Transformation (VST) program aims to establish an optimized, consistent, and streamlined
valuation environment across the organization, providing more reliable and robust core reporting that delivers
timely, value-added insights to support business management decisions. ETL development using Informatica Cloud
Services and Teradata v13.
 Migration of SSIS packages to Informatica ETL jobs.
 Acted as team lead for the Canadian Universal line, extracting data from flat-file sources and loading into
SQL Server. Developing incremental and historical logic for Type 2 mappings using Informatica PowerCenter 9.5.
 Processing source systems like relational tables, flat files and XML files.
 Optimizing ETL and database performance by implementing standard techniques such as session partitioning,
database partitioning, and collecting statistics.
 Creation of customized MultiLoad scripts on the UNIX platform for Teradata loads, and writing complex Teradata SQL
queries for ad hoc business requirements.
 Fine-tuning existing scripts and processes to increase performance and reduce load times for faster
user query performance.
 Accountable for architect-related deliverables to ensure all project goals are met within the project timelines.
 Following project standards like release and change management. Maintaining documentations like mapping
specifications document, ETL design document, etc.
 Interacting with the business to collect critical business metrics and provide solutions to certify data for business use.
 Distributing time among multiple projects by contributing as a technical advisor for ETL design and scope
evaluation.
 Supporting the QA team by fixing defects in code and documentation.
 Documenting ETL design and solutions and preparing testing scripts for dry runs.
 Processing ETL rules written in Fortran/Perl and implementing the solution in Informatica Cloud Services and
eventually reconciling the data between outputs from various systems.

Technical Environment: Informatica Cloud Services, SSIS, DataStage, SQL Server 2012, DB2

Aviva (TCS) June 2014 – August 2015


Sr. ETL Consultant

Responsibilities:

 Developing Informatica Data Replication configurations for the CSTP EDW initiative, extracting data from Guidewire
systems and loading it into the EDW landing zone, from where it is used for claims and commercial-line
analytics.
 Processed claims, billing center, and policy-level data from Guidewire tables. Developed related PowerBuilder
objects (windows, DataWindows, UserObjects) and database objects (stored procedures, packages,
functions, types, tables, views, etc.) using Oracle PL/SQL.
 Set up reports in Informatica Data Analyzer to run when a PowerCenter session (or a workflow) completes.
 Acted as a team lead for Case Manager project involving extracting data from DB2 sources and loading into
Teradata 3NF data warehouse.
 Involved in design and architecture initiatives for various projects at Aviva Canada.
 Perform analysis on business requirements and KPIs, and provide solutions to certify data for business use.
 Analyze the current data movement processes and procedures.
 Interact with technical analysts, business analysts, and operations analysts to resolve data issues.
 Creation of BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production
systems.
 Developing incremental and historical logic for Type 2 mappings using Informatica PowerCenter 9.5.
 Developing BTEQ shell scripts to accomplish batch jobs and automation of ETL process.
 Using Teradata Parallel Transporter connection to accomplish faster data loading process in Informatica session
connections.
 Following project standards like release and change management.
 Maintaining documentation such as mapping specification and ETL design documents. Scheduling IDR
jobs using ASG Zena.
 Distributing time among multiple projects by contributing as a technical advisor for ETL design and scope
evaluation.

Technical Environment:
Informatica PowerCenter 9.5, SSIS, Oracle 11g, Teradata v13 (tools and utilities), ASG Zena, BTEQ, FastLoad,
MultiLoad, and TPT scripts.

Toronto District School Board March 2014 – July 2014


Sr. ETL Consultant

Responsibilities:

 Study the existing Elementary Student Achievement data mart, developed as a PL/SQL package.
 Gathered and documented information and business rules about the existing Enrollment data warehouse from the
source-system expert.
 Designing migration solutions for moving legacy PL/SQL code to Informatica.
 Worked on a new design using only the Informatica tool for the Enrollment data warehouse.
 Functional requirement gathering
 Define the schema, staging tables, and landing tables; configure base objects (Customer, Address, and
Organization), foreign-key relationships, query groups, and queries.
 Create mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router,
Filter, and Update Strategy.
 Create the application detail design document based on the new design. Create the source-to-target mapping
specification document.
 Review Design with Subject matter expert/source system expert.
 Develop Complete ETL processes (Sources/Targets/mappings/sessions/workflows/mapplets/reusable
transformations/lookups etc.) using Informatica Designer PowerCenter version 9.5.1.
 Perform thorough unit testing and generate unit-testing artifacts as part of the overall documentation process.
 Review ETL code with team lead and making changes as per feedback.
 Participated in UAT testing: gave UAT knowledge transfer (KT) to business users by walking them through various test scenarios.
 Promote code to the test environment and maintain code versions using release management systems.

Technical Environment:
Informatica PowerCenter 9.5, Pentaho, Oracle 11g, Toad, SQL Server, Microsoft Visio

WIND Mobile, Toronto, ON August 2013 – March 2014


Sr. BI Analyst

Responsibilities:
 Supporting reporting requirements and functional design for wireless functional areas such as billing, revenue,
traffic, CRM, sales, marketing, POS, commissioning, and Oracle EBS as required.
 Extract, filter, format, and analyze OLTP information from Singleview systems using Informatica Analyzer 9.1.1,
and present it in easy-to-understand reports.
 Informatica PowerCenter 9.1.0 ETL design and development.
 Using Informatica PowerCenter Designer, Workflow Monitor, and Repository Manager.
 ETL and SQL Performance tuning by identifying bottlenecks.
 Writing BTEQ scripts to automate batch jobs and TPT scripts to enhance data loading process at Informatica
session level.
 Also developed FastLoad and MultiLoad scripts to facilitate better data movement into Teradata.
 ETL SIT administration and UAT support.
 Provide direction on Business Intelligence/ data architecture /Data Warehouse technology and methodology.
 Create Informatica mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup,
Router, Filter, and Update Strategy, following project code standards set by the architecture team.
 Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and
summarize data.
 High proficiency in performing data analysis for complex source systems and data models.
 Analyze the business requirements to design, develop and implement highly efficient, highly scalable SQL &
PL/SQL scripts and reports.
 Perform data mappings and design ETL workflow solutions; develop and implement ETL solutions to meet business
and project requirements.
 Monitoring the scheduled production extracts daily to identify and report failures in the data warehouse
load.
 Analyzing and correcting data extract scripts.
 Develop and analyze complex SQL and PL/SQL scripts; data modeling, relational database design, and architecture;
work with other data warehousing development teams.

Technical Environment:
Informatica PowerCenter 9.5, Informatica Analyzer 9.5.1, Oracle 11g, Teradata database and utilities, UNIX,
Control-M

Loblaw, Mississauga, ON August 2011 – January 2013


ETL Developer (Contract)

Responsibilities:

 Post-production implementation support responsibilities.


 Customized database objects such as PL/SQL procedures, functions, and packages.
 Create UNIX scripts to manipulate and load the data.
 Assisted in batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and
summarize data.
 Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator,
Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Unconnected Lookup transformations.
 Providing strategic and tactical direction to the team
 Responsible for Technical implementation of the change request solution
 Supporting enterprise data warehouse and ETL development activities
 Analysis of change request
 Working in the Production Support environment
 Designing the technical solution of Business requirement
 Analysis of the possibilities of solution enhancement
 Developed UNIX Shell Scripts for scheduling the sessions in Informatica
 Implemented slowly changing dimensions (SCD) Type 1 and Type 2 to maintain current and historical
information in the dimension tables.
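The SCD Type 2 behavior in the last bullet (keep history by expiring the current row and inserting a new version, where Type 1 would simply overwrite) can be sketched as follows. This is an illustrative Python sketch with a hypothetical schema; the actual work used Informatica Update Strategy logic against the dimension tables.

```python
from datetime import date

# Hypothetical sketch of SCD Type 2: expire the current dimension row
# and append a new current version, preserving history. (Type 1 would
# instead overwrite the existing row in place, keeping no history.)
def scd_type2(dim_rows, key_col, new_row, change_date):
    for row in dim_rows:
        if row[key_col] == new_row[key_col] and row["is_current"]:
            row["is_current"] = False      # expire the old version
            row["end_date"] = change_date
    dim_rows.append({**new_row, "is_current": True,
                     "start_date": change_date, "end_date": None})

# Sample dimension row (made up for illustration).
dim = [{"cust_id": 7, "city": "Toronto", "is_current": True,
        "start_date": date(2011, 1, 1), "end_date": None}]

scd_type2(dim, "cust_id", {"cust_id": 7, "city": "Ottawa"}, date(2012, 6, 1))

current = [r for r in dim if r["is_current"]]
history = [r for r in dim if not r["is_current"]]
print(current[0]["city"])  # the new current value
print(history[0]["city"])  # the expired historical value
```

Queries for "as of" a past date filter on the `start_date`/`end_date` window instead of the `is_current` flag.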

Technical Environment:
BODS XI (data integration tool for ETL), Oracle 11g, Teradata 13, Sun Solaris, Control-M, Informatica PowerCenter

Cisco Systems July 2006 – June 2011


Informatica/ETL Developer

Responsibilities:

 Implementing business logic for source-to-target data loading. Implementing business solutions per
requirement specifications.
 Understanding the various layers of data transformation and loading process involved.
 Unit testing and documenting the ETL mappings developed for an interface.
 Creation of BTEQ, FastLoad, and MultiLoad scripts, wherever applicable, to facilitate the data loading process.
Gathering requirements from the client/onsite counterpart and assigning work to the offshore team.
 Developing ETL using Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump) and Informatica mappings
for loading data into the target system.
 Optimization of Teradata and ETL code. Developing design documents, deployment plans, ETL specifications,
and business requirement documents.

Technical Environment:
Informatica PowerCenter 9.5, Oracle 11g, DataStage 7.x, Teradata v12, MS Visio, Tidal scheduling.

Education Summary

Bachelor of Engineering – 2006, Gyan Ganga Institute of technology & Sciences, Jabalpur, MP, India.
