
SARASWATI KHATRI

Data Analyst | Data Engineer | Business Intelligence | Azure Cloud Data Engineer
+1 (913) 353-8724 | saraswatikhatri1988@gmail.com

Professional Summary:
 Senior Data Analyst with over 10 years of experience in the insurance, healthcare, banking, and logistics domains, with
expertise in Agile methodologies, business process analysis, and system documentation.
 Demonstrated ability to lead agile ceremonies, drive backlog grooming sessions, and mentor developers to ensure timely
delivery of high-quality software products.
 Skilled in conducting stakeholder interviews, performing gap analysis, impact analysis, and root cause analysis, and creating
various types of project documentation like BRD, FRD, SRS, and Release Notes.
 Proficient in SQL for querying and analyzing data from various sources, designing test plans and test cases for UAT, and
using tools like JIRA and HP QC/ALM for bug and defect management.
 Experienced in working with various EDI X12 transactions like 270, 271, 834, 835, and 837, and complying with
government regulations like HIPAA.
 Strong understanding of managed care payer requirements and procedures, HEDIS benchmarking, EMR implementation,
and post-implementation support.
 Skilled in using data visualization tools like Tableau, Power BI for blending data from multiple sources, linking data on
one dashboard, and filtering data in multiple views at once.
 Demonstrated ability to work with Microsoft Power BI for data analysis and dynamic visualization, driving informed
decision-making and enhancing business insights.
 Experience in Data Integration and Data Warehousing using various ETL tools such as Informatica PowerCenter, AWS Glue,
SQL Server Integration Services (SSIS), Talend, and Azure Data Factory.
 Experience in Designing Business Intelligence Solutions with Microsoft SQL Server and using MS SQL Server
Integration Services (SSIS), MS SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS).
 Proficient in Microsoft SharePoint, Business Intelligence Analytics, SQL Server Analysis Services, and Performance
Point Services for database solutions and analytics.
 Experience working with Amazon Web Services (AWS) cloud and its services like Snowflake, EC2, S3, RDS, EMR,
VPC, IAM, Elastic Load Balancing, Lambda, RedShift, Elastic Cache, Auto Scaling, Cloud Front, Cloud Watch,
Data Pipeline, DMS, Aurora, ETL and other AWS Services.
 Strong expertise in relational database systems like Oracle, MS SQL Server, Teradata, MS Access, and DB2, including
database design and development using SQL, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Highly proficient in writing,
testing, and implementing triggers, stored procedures, functions, packages, and cursors using PL/SQL.
 Hands-on experience with the Snowflake cloud data warehouse on AWS and S3 buckets for integrating data from multiple
source systems, including loading nested JSON-formatted data into Snowflake tables.
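The flattening step before such a load can be sketched in Python. This is a minimal illustration, not the actual pipeline: the field names and payload are hypothetical, and a real Snowflake load would use staged files and VARIANT columns rather than pandas.

```python
import json
import pandas as pd

# Hypothetical nested JSON record of the kind staged in S3 before a
# Snowflake load; the field names here are illustrative only
raw = '{"policy": {"id": "P-100", "holder": {"name": "A. Smith", "state": "KS"}}, "premium": 1200}'
record = json.loads(raw)

# Flatten nested keys into dot-separated columns suited to a relational target table
flat = pd.json_normalize(record)
```

The resulting frame has one column per leaf key (`policy.id`, `policy.holder.name`, and so on), which maps naturally onto a flat Snowflake table.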
 Demonstrated ability to work with cross-functional teams, facilitate daily scrum meetings, enforce time-boxes, and respond
to impediments to ensure smooth project execution.
 Developed responsive websites using HTML, CSS, JavaScript, and PHP and utilized databases to store customer
information and order history, improving the user experience by allowing customers to view their purchase history and
reorder easily.
 Extensive experience in Data Mining solutions to various business problems and generating data visualizations using
Tableau, Power BI, Alteryx
 Proficient in leveraging Microsoft Teams to facilitate seamless collaboration, communication, and project coordination
among stakeholders and team members.
 Utilized Python's statistical libraries, including NumPy and pandas, to perform in-depth data analysis and extract
meaningful insights; worked with Camunda to model and optimize complex business processes, enhancing efficiency
and streamlining workflows; and expanded data visualization capabilities through Python libraries like Matplotlib,
Seaborn, and Plotly, creating compelling visual representations of data.
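A NumPy-plus-pandas analysis of this kind can be sketched as follows. The data and column names are hypothetical, chosen only to show the pattern of a numeric summary alongside a group-level view.

```python
import numpy as np
import pandas as pd

# Hypothetical delay records; carriers and values are illustrative only
df = pd.DataFrame({
    "carrier": ["AA", "AA", "DL", "DL", "UA"],
    "delay_min": [12.0, 30.0, 5.0, np.nan, 45.0],
})

# NumPy provides the NaN-aware overall summary
overall_mean = np.nanmean(df["delay_min"].to_numpy())

# pandas provides the per-group view (mean skips NaN by default)
by_carrier = df.groupby("carrier")["delay_min"].mean()
```

The same split of labor (NumPy for array math, pandas for grouped aggregation) scales to much larger extracts.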
 Expertise in using GitHub for effective code review, version control, and collaborative documentation, promoting efficient
software development practices.
 Familiarity with Salesforce CRM, enabling efficient customer relationship management and enhancing business operations
within the industry context.
 Adept at integrating R with Azure Machine Learning services to build and deploy predictive models at scale.
 Experienced in working with Azure Databricks to implement Python-based data transformations, machine learning
workflows, and advanced analytics.
 Experience with ETL workflow management tools like Apache Airflow, with significant experience writing
Python scripts to implement workflows.
 Experience in identifying Bottlenecks in ETL Processes and Performance tuning of the production applications using
Database Tuning, Partitioning, Index Usage, Aggregate Tables, Session partitioning, Load strategies, commit intervals
and transformation tuning.
 Worked on performance tuning of user queries by analyzing explain plans, recreating user driver tables with the right
primary index, scheduling collection of statistics, and using secondary or various join indexes.

 Experience with scripting languages like PowerShell, Perl, Shell, etc.
 Expert knowledge and experience in fact/dimensional modelling (star schema, snowflake schema), transactional modelling,
and SCD (slowly changing dimensions).
 Created clusters in Google Cloud and managed them using Kubernetes (k8s); used Jenkins to deploy code to Google
Cloud, create new namespaces, and build Docker images and push them to Google Cloud's container registry.

Technical Summary:

Reporting Tools: Power BI, Tableau, SQL Server Reporting Services (SSRS), Crystal Reports, SQL Server Management
Studio, Power BI Report Builder
ETL Tools: SQL Server Integration Services (SSIS), Azure Data Factory, dbt
Cloud: Azure Data Lake, Azure Data Factory, Databricks, Snowflake, Azure Synapse, Azure DevOps, AWS Snowflake,
AWS RDS, AWS Aurora, Redshift, EC2, EMR, S3, Lambda, Glue, Data Pipeline, Athena, Data Migration Services, SQS,
GCP, Kubernetes, Docker
Databases: SQL Server Analysis Services, SQL Server 2008 R2, 2012, 2014, 2016, 2020, 2022, MySQL, Oracle,
MongoDB, PostgreSQL
Operating Systems: Windows, UNIX, Linux
Tools: Jira, MS Access, Word, Excel, TFS
Languages: SQL, PL/SQL, Python, Unix Shell Scripting, T-SQL

Work Experience:

Senior Data Analyst Dec 2021 – Present


Federal Aviation Administration
Project Overview:
The "Aviation Safety Data Enhancement Project" within the Federal Aviation Administration (FAA) focuses on improving
aviation safety and operational efficiency through data analysis. This initiative involves collecting, integrating, and
analyzing aviation data to identify safety trends, optimize air traffic management, and enhance decision-making. By creating
data visualization tools and reports, the project aims to proactively manage risks and improve operational efficiency.
Collaboration with cross-functional teams and effective communication of findings are key components of this project.
Ultimately, it seeks to ensure safer and more efficient air travel within the FAA.
Responsibilities:
 Designed and crafted interactive dashboards, insightful reports, and compelling visualizations using Power BI, catering
to a wide range of business requirements. As an expert user of Business Intelligence (BI) products, I also developed
new and innovative approaches, methodologies, and techniques to define and direct challenging activities.
 Engaged proactively in sessions for gathering requirements with key stakeholders, ensuring a comprehensive grasp of
business imperatives. Applied my skill in collecting data using Business Intelligence (BI) tools and other application
software such as SAP Business Objects, Alteryx, and Appian to collect, analyze, and transform data into actionable
information. Additionally, I had a deep understanding of data warehousing concepts and employed advanced data
modeling techniques to structure and organize data within the data warehouse. In this project, I also leveraged dbt
(Data Build Tool) to manage and execute transformations, ensuring data modeling and analytics processes were well-
orchestrated.
 Utilizing dimensional modeling and star schema design, I ensured that data models within the data warehouse were
optimized for reporting and analytics, seamlessly integrating with Power BI for dynamic and visually engaging
reporting. dbt played a crucial role in structuring, transforming, and documenting the data pipelines in these projects.
 Played an active role in Program Increment (PI) planning sessions, aligning project aspirations with organizational
objectives, and supported the Scrum Master in skillfully managing agile ceremonies, ensuring adherence to
sprint schedules and the effective delivery of features.
 I also worked on aligning data warehousing strategies with the overall project goals and objectives, including selecting
appropriate data modelling techniques based on business needs, with the aim of delivering impactful insights through
Power BI.
 Formulated intricate SQL queries and established robust table relationships to create a solid foundation for data
modelling and analysis. I also implemented intricate business logic based on client requirements, thereby enhancing the
depth and accuracy of data analysis. These data analysis efforts often relied on data models optimized through
various data modelling techniques, seamlessly integrated into Power BI for intuitive and user-friendly dashboards and
reports, with dbt ensuring the consistency and reliability of data transformations upstream.
 Consolidated diverse customer data sources using Azure Synapse Analytics, facilitating comprehensive reports and
dynamic dashboards. I also utilized Azure Machine Learning to build and deploy predictive models, enabling data-
driven decision-making at scale, with a focus on leveraging data from the well-designed data models within the data
warehouse and presented through Power BI's rich visualizations. dbt played a vital role in ensuring data consistency
and lineage throughout these complex data pipelines.
 Developed web services, SOAP message envelopes and headers using WSDL, UDDI, XML and JAXP.
 Implemented complex ETL workflows in Azure Data Factory for seamless data migration and transformation across
on-premises servers and Azure Cloud. Leveraged Azure Databricks for Python-based data transformations, advanced
analytics, and machine learning workflows, ensuring that data from data warehouses was integrated effectively into
these workflows while adhering to established data modelling techniques, making the outcomes accessible through
Power BI, and supporting the development, testing, and documentation of these ETL processes.
 Extensive experience in developing applications using Java, Ext Js, JSP, Servlets, JSP Custom Tag Libraries, JDBC,
JNDI, SQL, AJAX, JavaScript and XML.
 Collaborated seamlessly with cross-functional teams to gather intricate requirements, translating them into actionable
insights. I employed advanced SQL techniques to conduct in-depth data analysis, uncovering meaningful trends,
patterns, and anomalies that drive effective decision-making, often involving data modelled using advanced data
modelling techniques and presented through Power BI's interactive and user-friendly interface. dbt ensured that
transformations were consistent, well-documented, and easily maintainable.
 Used pandas as an API to put data into time-series and tabular formats for manipulation and retrieval.
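A minimal sketch of that time-series-plus-tabular pattern in pandas, using hypothetical dates and values:

```python
import pandas as pd

# Hypothetical daily metric placed on a DatetimeIndex for manipulation
idx = pd.date_range("2022-01-01", periods=6, freq="D")
series = pd.Series([10, 20, 30, 40, 50, 60], index=idx)

# Tabular view of the same data
table = series.to_frame(name="value")

# Time-series manipulation: weekly roll-up via resample
weekly = series.resample("W").sum()
```

`resample` groups the daily points into calendar weeks, which is the kind of retrieval-friendly reshaping the time-series index enables.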
 Orchestrated the creation and maintenance of data connections, sources, and refresh protocols in data environments,
ensuring data freshness and availability. I also leveraged SQL optimization techniques to enhance the performance of
data models and queries, resulting in faster and more efficient reporting on data that had been structured using data
modelling techniques and visualized through Power BI's powerful capabilities. ADF contributed to the efficient
management of data transformations and automated testing, ensuring data quality and consistency.
 Conducted comprehensive training sessions, empowering business users to harness the power of Power BI and
advanced analytics for self-service insights, enabling them to create and explore their own reports and dashboards,
thereby democratizing data access and decision-making. ADF’s role in managing data pipelines ensured that the data
presented in Power BI was accurate, reliable, and up to date.
 Extracted data from Twitter using Java and the Twitter API. Parsed JSON-formatted Twitter data and uploaded it to a database.
 Offered technical guidance to stakeholders, ensuring the alignment of data-driven strategies with business objectives
and maintaining data integrity, actively participated in all facets of project planning and management, ensuring timely
and high-quality solution delivery, including data warehousing projects with a focus on delivering actionable insights
through Power BI and well-managed data pipelines using ADF.
 Worked with Lightning components; leveraged an Apex Controller to make calls for external requests to retrieve data
from various APIs and display them on the component.
 Experienced in integration of Salesforce.com with external applications by using Web Services API, Metadata API, and
SOAP.
 Regularly attended and contributed to daily stand-up meetings, ensuring the smooth progression of tasks and efficient
communication within the team. Skillfully assigned tasks to team members, aligning assignments with user stories to
optimize task management and project execution, including tasks related to data modelling, Power BI development, and
ADF pipeline management.
Environment: Python, Power BI, SQL Server, Azure Data Lake, Azure Databricks, Azure Data Factory

Client: MERCK PHARMA, Branchburg, NJ June 2018 - Nov 2021


Role: Senior Business Analyst
Project Overview:
MERCK PHARMA is renowned in the pharmaceutical industry for its commitment to research, innovation, and healthcare
excellence. The goal of this project is to enhance the Medicare/Medicaid claims processing system to ensure compliance
with government regulations such as HIPAA (4010A1) and EDI formats. The objective is to improve the efficiency,
accuracy, and security of claims processing while adhering to accredited standards like ANSI. The project will involve
various modules of the Medicaid Management Information System (MMIS).
Responsibilities:
 Recommended changes for system design, methods, procedures, policies, and workflows affecting Medicare/Medicaid
claims processing in compliance with government compliant processes like HIPAA (4010A1) / EDI formats and
accredited standards like ANSI. Worked on various modules of MMIS.
 Organized and stored interview notes, questions, and insights using Microsoft OneNote.
 Designed and distributed surveys to gather stakeholder feedback using SurveyMonkey.
 Developed the UI Screens using HTML5, DHTML, XML, Java Scripts, Ajax, jQuery Custom-tags, JSTL DOM Layout
and CSS3.
 Worked with DBA to extract API data into SQL Server and connected to the same from Tableau for various reporting
requirements.

 Used SQL for querying and analysis of various source tables and wrote SQL joins and sub-queries.
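The join-plus-sub-query pattern can be illustrated with an in-memory SQLite stand-in; the table names and schema below are hypothetical, not the actual claims tables.

```python
import sqlite3

# In-memory SQLite stand-in for the source tables; schema is illustrative only
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE members (id INTEGER, name TEXT);
    CREATE TABLE claims (member_id INTEGER, amount REAL);
    INSERT INTO members VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO claims VALUES (1, 100.0), (1, 50.0);
""")

# A correlated sub-query per member, of the kind used for source-table analysis
rows = conn.execute("""
    SELECT m.name,
           (SELECT COALESCE(SUM(c.amount), 0)
              FROM claims c WHERE c.member_id = m.id) AS total
      FROM members m ORDER BY m.id
""").fetchall()
```

`COALESCE` guards the members with no claims, so every row carries a numeric total rather than NULL.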
 Assessed the current system documentation for the Clinical Research Collaborations (CRC) System and provided
recommendations for consolidating and updating the existing system documentation.
 Experience with HEDIS (Healthcare Effectiveness Data and Information Set) for benchmarking managed care
effectiveness and evaluating provider payer relationships.
 Used multiple data sources by blending data on a single worksheet in Tableau Desktop.
 Generated numerous BRD & FRD, use cases, system flow and workflow diagrams and provided technical support for
routine security related issues.
 Produce and explore requirements in collaboration with product owner, scrum master and project manager at the level
of detail required for Sprints.
 Used SQL Server Reporting Services (SSRS) to schedule reports to be generated on predetermined time.
 Responsible for creating ETL/metadata documents for Business users to document the mapping / lineage of
transactional and reference data elements using Microsoft Excel.
 Design and development of Web pages using HTML, CSS including Ajax controls and XML.
 Expertise in performing Bug and Defect management using bug-tracking tools like JIRA and HP QC/ALM.
 Implemented process using EDI X12 270 / 271 verifying enrolment for all sub-contractors increasing enrolment
accuracy up to 99%. Maintained the quality of data analysis, researched output, and reporting, and ensured that all
deliverables met specified requirements.
 Leveraged Tableau's advanced features to enhance agile processes, linking data and JIRA for streamlined project
management.
 Proficiently navigated Veeva Systems to streamline end-to-end clinical trial management, fostering seamless
collaboration among cross-functional teams and optimizing trial timelines.
 Conducted configuration and maintenance of EDI X12 formats related to Medicare, Medicaid Diversion and
Commercial Payors.
 Leveraged Medidata Rave for seamless management and analysis of clinical trial data, ensuring data integrity and
regulatory compliance.
 Employed SAS for advanced statistical analysis, enabling data-driven insights for optimizing healthcare processes and
outcomes.
 Responsible for providing database solutions using SQL Server.
 Experienced with Microsoft SharePoint 2010, Business Intelligence Analytics, SQL Server Integration services
SSIS, SQL Server Analysis services, Performance Point Services, ensuring Product backlog is of manageable size,
preparing Product Release Burndown charts.
 Capitalized on the dynamic capabilities of MadCap Flare to craft comprehensive and user-friendly technical
documentation, enhancing knowledge sharing and facilitating seamless project collaboration.
 Experience with EMR implementation and post-implementation support.
 Shared expert knowledge through impactful Draw.io training sessions, empowering colleagues to create compelling
visual assets.
 Developed web services, SOAP message envelopes and headers using WSDL, UDDI, XML and JAXP.
 Managed regulatory document files, including Investigator Site Files, Form FDA 1572, informed consents, and clinical
trial documents. Worked with 837, UB92, UB04, CMS 1500 claims and HIPAA 835, 270/271, 276/277, 278 transactions.
 Worked on EDI transactions 270, 271, 834, 835, and 837 (P, I, D) to identify key data set elements for the designated record
set. Interacted with Claims, Payments, and Enrollment, analyzing and documenting related business processes.

Environment: Agile, Microsoft Visio, FACETS, EDI X12, HIPAA, Tableau, SQL, Microsoft Office Suite, Microsoft
SharePoint, Microsoft OneNote, SurveyMonkey, ETL, Lucid chart, Draw.io, Informatica, JIRA, SAP BusinessObjects, SAS,
Medidata Rave, Veeva Systems, IBM Infosphere DataStage.

Data Analyst Sep 2016 – May 2018

State Farm Insurance Company
Project Overview:
The project titled "Customer Segmentation and Risk Analysis for State Farm Insurance" aims to utilize data analytics for
a comprehensive understanding of State Farm's customer base, effective customer segmentation, and thorough insurance
risk evaluation. The project involves collecting data from various sources, data preprocessing, segmenting customers
based on demographics and coverage, and constructing predictive models for risk assessment. Visualizations and
dashboards will be generated to communicate findings, leading to personalized recommendations. The advantages
encompass precise customer targeting, improved risk management, competitive edge, and potential cost reductions. This
data-centric initiative will enhance State Farm's customer insights, streamline operations, and strengthen decision-
making, benefiting both the company and its policyholders. Involved in gathering business requirements, logical
modelling, physical database design, data sourcing and data transformation, data loading, SQL, and performance tuning.

Used SSIS to populate data from various data sources, creating packages for different data loading operations for
applications.
Responsibilities:
 Experience in developing Spark applications using Spark-SQL in Databricks for data extraction, transformation, and
aggregation from multiple file formats.
 Extracted data from various sources like SQL Server 2016, CSV, Microsoft Excel, and text files from client servers.
 Performed data analytics on the Data Lake using PySpark on the Databricks platform.
 Designed and documented the entire Architecture of Power BI POC.
 Implementation and delivery of MSBI platform solutions to develop and deploy ETL, analytical, reporting and scorecard
/ dashboards on SQL Server using SSIS, SSRS.
 Extensively worked with SSIS tool suite, designed and created mapping using various SSIS transformations like
OLEDB command, Conditional Split, Lookup, Aggregator, Multicast and Derived Column.
 Scheduled and executed SSIS Packages using SQL Server Agent and Development of automated daily, weekly and
monthly system maintenance tasks such as database backup, Database Integrity verification, indexing and statistics
updates.
 Worked extensively on SQL, PL/SQL, and UNIX shell scripting, Expertise in creating PL/ SQL Procedures, Functions,
Triggers, and cursors.
 Loaded data into NoSQL databases (HBase, Cassandra); expert-level knowledge of complex SQL using Teradata
functions, macros, and stored procedures.
 Developed under Scrum methodology in a CI/CD environment using Jenkins; deployed EC2 instances for Oracle
databases; utilized Power Query in Power BI to pivot and un-pivot the data model for data cleansing.
 Designed and implemented migration strategies for traditional systems on Azure (lift-and-shift, Azure Migrate, and
other third-party tools); used various sources to pull data into Power BI, such as SQL Server, Excel, Oracle, and SQL Azure.
 Proposed architectures considering cost/spend in Azure and developed recommendations to right-size data infrastructure;
designed, set up, maintained, and administered Azure SQL Database, Azure Analysis Services, Azure SQL Data
Warehouse, and Azure Data Factory.
 Worked in an Azure SQL Database environment; experienced with Windows Hyper-V Server, Azure, and Windows
Clustering.
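The pivot/un-pivot cleansing step described above has a direct pandas analogue, sketched here with a hypothetical wide table (this is not Power Query itself, just the same reshaping expressed in Python):

```python
import pandas as pd

# Hypothetical wide table; melt is the pandas analogue of Power Query's Unpivot
wide = pd.DataFrame({"region": ["East", "West"], "q1": [100, 80], "q2": [120, 90]})
long = wide.melt(id_vars="region", var_name="quarter", value_name="sales")

# pivot_table is the analogue of Pivot, restoring one column per quarter
back = long.pivot_table(index="region", columns="quarter", values="sales").reset_index()
```

Un-pivoting to the long form is what makes the data cleanly filterable and aggregable before it is pivoted back for presentation.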
Environment: MS SQL Server 2016, ETL, SSIS, SSRS, SSMS, Cassandra, Oracle 12c, Oracle Enterprise Linux, Teradata,
Databricks, Jenkins, Power BI, Autosys, Unix Shell Scripting, Azure.

Data Analyst Sep 2013 – Aug 2016


Bank of America Financial services company
Project Overview:
This project aims to analyze customer data for Bank of America to predict churn and formulate effective retention strategies.
It involves collecting diverse customer data, conducting exploratory analysis to grasp customer behavior, selecting
pertinent features for churn prediction, and developing predictive models. Model performance is assessed, leading to
personalized retention approaches, real-time monitoring, and testing. The project's findings and recommendations aim to
enhance customer retention, reduce costs, boost satisfaction, and provide a competitive edge in the financial sector.
Responsibilities:
 Engage in client interactions to understand and discuss reporting requirements for developing paginated reports and
Dashboard.
 Developed diverse report types, including drilldown, drill-through, matrix, sub-reports, graphical, and chart reports,
catering to different business audiences' preferences, using Azure Synapse Analytics for data analysis and presentation.
 Collaborated with stakeholders to gather and validate reporting needs, ensuring accurate representation of data insights.
 Implemented data governance strategies to ensure data accuracy and adherence to industry regulations.
 Create and utilize custom code for enhanced functionality within Report development.
 Design and implement efficient data retrieval mechanisms to extract information for reports.
 Conduct testing and debugging activities to ensure report accuracy and resolve any issues promptly.
 Involved in various data cleaning and data transformation activities using Power BI.
 Loaded data into Power BI from various data sources and developed user-friendly dashboards.
 Maintain documentation of report specifications, processes, and improvements for future reference and knowledge
sharing.
Environment: SQL Server, SSRS, Power BI, Data cleaning, Data Transformation.
Education
Bachelor of Technology from Tribhuvan University - Nepal
