ASHWINI KRISHNAN
OBJECTIVE
• Seeking to leverage deep expertise in SnapLogic pipelines, API integrations, and enterprise data-flow automation to contribute to innovative, scalable integration strategies in a forward-thinking organization.
• Passionate about streamlining business processes, improving data quality, and delivering high-performance
integration solutions.
SUMMARY
• SnapLogic Developer with over 7 years of hands-on experience designing, developing, and deploying integration pipelines across cloud and on-premises systems.
• Strong database knowledge, including Oracle RDBMS, PL/SQL, and Snowflake.
• Developed multiple SnapLogic pipelines integrating diverse source and target systems.
• Strong working experience in building SnapLogic pipelines, error handling, and scheduling tasks & alerts.
• Excellent experience in translating functional specifications and user stories into technical specifications.
• Experienced with Amazon Redshift: creating tables, views, and stored procedures.
• Strong knowledge of cloud data storage and access using Snowflake.
• Developed ETL mappings and workflows according to the use case.
• Good knowledge of and experience with APIs and scripting.
• Fully hands-on with SnapLogic Designer, Manager, and Monitor.
SKILLS
• SnapLogic, SnapLogic pipelines, PL/SQL, SQL, RDBMS, Oracle, Snowflake, AWS Redshift, AWS S3, Azure Cloud, ETL, Informatica, data storage, scripting, scheduling tasks & alerts, error handling, Java, Python, C programming, Data Structures and Algorithms
CERTIFICATIONS
• SnapLogic Certified Automation Professional
• SnapLogic Integrator
• SnapLogic Developer
• BEC Vantage with aggregate
ACADEMIC DETAILS
• Sri Krishna College of Technology – Coimbatore | Bachelor's in Computer Science and Engineering - 90% | 2015 - 2019
PROFESSIONAL EXPERIENCE
Accenture Solutions Pvt Ltd | Nov 2023 - Present
Responsibilities:
• Gathered requirements from the business and designed the integration architecture.
• Resolved performance and complex-logic issues in the ETL (Extract, Transform, Load) process through effective performance tuning, identifying bottlenecks and critical paths of data transformation in sessions for different interfaces.
• Implemented an end-to-end solution for hosting a web application on the AWS cloud with integration to S3 buckets.
• Worked closely with architects and engineers to recommend and design database and data-storage solutions that effectively reflect business needs, security, and service-level requirements.
• Performed logical and physical data-structure design and DDL generation to implement database tables and columns in DB2, SQL Server, AWS cloud (Snowflake), and Oracle DB schema environments using ERwin Data Modeler Model Mart Repository version 9.6.
• Designed and deployed multiple applications utilizing much of the AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto-scaling, provisioned via AWS CloudFormation.
• Worked with client/server technology, GUI design, relational database management systems (RDBMS), and rapid application development methodology.
• Primarily involved in coding web pages for dashboards and in back-end data processing, scripting, and cron-job activities.
• Created reports in Looker based on Snowflake connections.
• Provided operations support and met business-requirement expectations for projects on Azure cloud (PaaS and IaaS) infrastructure, working with project teams and client-services groups to deploy and host environments.
• Loaded data from flat files into Oracle tables using SQL*Loader and PL/SQL procedures (see the first sketch after this list).
• Worked on notification services, setting up scheduled jobs and alerts.
• Skilled in building scalable and reusable SnapLogic pipelines to automate data flow between diverse applications and databases.
• Performed extensive data validation and integrity checks using shell scripting before delivering data to operations.
• Implemented error handling and logging pipelines using Error Pipelines, error views, and custom error email alerts.
• Extensively used cursors, ref cursors, user-defined object types, collections, records, joins, subqueries, and tables in Oracle PL/SQL programming.
• Wrote application-level code to interact with APIs and web services using AJAX, JSON, and XML (see the second sketch after this list).
• Reduced data transfer times by optimizing pipeline memory usage and enabling Ultra Pipelines for real-
time processing.
• Automated ETL processes using JSON and XML data formats with the SnapLogic expression language and Mapper Snaps.
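
First sketch: a minimal Python version of the flat-file-to-Oracle load described above, assuming the python-oracledb driver and an invented staging table and CSV layout (id, name, amount); connection details are placeholders, not taken from the projects listed here.

    # Minimal flat-file-to-Oracle load; table name, columns, and connection
    # details are illustrative placeholders.
    import csv
    import oracledb  # python-oracledb driver

    def load_flat_file(path, user, password, dsn):
        with oracledb.connect(user=user, password=password, dsn=dsn) as conn:
            with conn.cursor() as cur, open(path, newline="") as fh:
                rows = [(r["id"], r["name"], r["amount"]) for r in csv.DictReader(fh)]
                # Batch the inserts instead of one round trip per row
                cur.executemany(
                    "INSERT INTO staging_orders (id, name, amount) "
                    "VALUES (:1, :2, :3)",
                    rows,
                )
            conn.commit()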
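Second sketch: application-level API interaction with JSON; the endpoint, payload, and field names are hypothetical, and the requests library stands in for whichever HTTP client a given project used.

    # Post a JSON document to a hypothetical REST endpoint and return the
    # parsed JSON response; requests handles serialization and headers.
    import requests

    def push_order(order: dict) -> dict:
        resp = requests.post(
            "https://api.example.com/v1/orders",  # hypothetical endpoint
            json=order,
            headers={"Accept": "application/json"},
            timeout=30,
        )
        resp.raise_for_status()  # raise on 4xx/5xx instead of failing silently
        return resp.json()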
AstraZeneca India Pvt Ltd | June 2019 - Nov 2023
Responsibilities:
• Gathered requirements from the business.
• Worked with application teams to understand existing applications and to recommend migration approaches and to-be architectures in AWS.
• Converted user stories into technical specifications.
• Validated the ETL load process to ensure target tables were populated according to the provided data mapping and transformation rules (see the first sketch after this list).
• Worked with T-SQL: stored procedures, user-defined functions, views, indexes, triggers, and error handling.
• Used AWS EMR to transform and move large amounts of data into and out of other AWS data stores and
databases, such as Amazon Simple Storage Service (Amazon S3) and Amazon DynamoDB.
• Defined and created API plans that treat the APIs as product offerings and allow several APIs and
resources per plan.
• Implemented solutions to move on-premises data into a predefined GCP data storage location.
• Prepared database scripts for DB objects and executed them in AWS Redshift.
• Wrote Python and shell scripts to perform operations or tasks not natively supported by SnapLogic pipelines (see the second sketch after this list).
• Developed Oracle PL/SQL stored procedures, functions, packages, and SQL scripts to support functionality for various modules.
• Created Triggered, Scheduled, and Ultra tasks in SnapLogic.
• Set up continuous integration and continuous deployment (CI/CD) of applications to the Azure cloud.
• Developed AJAX scripting to invoke server-side JSP processing.
• Prepared design documents (HLD, LLD) and source-to-target column mapping sheets.
• Integrated heterogeneous data sources such as Redshift, Salesforce, SAP, Snowflake, and external files.
• Performed error handling as part of production support on Informatica as well as Unix.
• Designed star and snowflake schemas and wrote ETL code to load data into dimension and fact tables.
• Built scalable pipelines to map and translate data between SaaS and on-premises systems for both operational and analytics purposes.
• Fine-tuned many long-running view queries.
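
First sketch: a minimal post-load ETL validation in Python, assuming a generic DB-API connection (e.g. Redshift via psycopg2 or Oracle via oracledb) and an invented file/table pair; in practice the mapping-sheet rules would drive richer checks than a row count.

    # Compare the source-file row count against the loaded target table;
    # the table and file names are placeholders for illustration only.
    def validate_load(conn, source_path: str, target_table: str) -> None:
        with open(source_path) as fh:
            expected = sum(1 for _ in fh) - 1  # subtract the header row
        with conn.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {target_table}")
            actual = cur.fetchone()[0]
        if expected != actual:
            raise ValueError(
                f"{target_table}: expected {expected} rows, found {actual}"
            )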
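Second sketch: driving SnapLogic from a Python script. SnapLogic Triggered Tasks expose an HTTPS endpoint, but the feed URL and bearer token below are placeholders, not values from these projects.

    # Invoke a SnapLogic Triggered Task over HTTPS for work that does not
    # fit inside a pipeline; URL and token are placeholders from Manager.
    import requests

    TASK_URL = "https://elastic.snaplogic.com/api/1/rest/slsched/feed/ORG/PROJECT/my_task"
    TOKEN = "..."  # placeholder; copied from the task details in SnapLogic Manager

    def run_task(payload: dict) -> str:
        resp = requests.post(
            TASK_URL,
            json=payload,
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=300,
        )
        resp.raise_for_status()
        return resp.text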
AWARDS FOR EXCELLENCE
• Winner of AZ-Hackathon HIC2020.
• Received ‘Star Performer’ for active and effective participation in various DQ frameworks.
• Received recognition for the TEMAQ project.
• Accenture - FY24 APEX Award Winner