ETL Roles and Responsibilities

The document outlines the essential steps for ETL testing, including understanding requirements, raising clarifications, and preparing test plans and cases. It emphasizes the importance of executing test cases, raising defects, and participating in defect triage, as well as tracking defects and providing regular status updates. Finally, it concludes with the process of giving signoff once testing is complete and all requirements are met.


1. Understanding Requirements:
● Explanation: The tester must thoroughly understand the business
requirements and the technical specifications of the ETL process. This
includes understanding the source data, the transformations that need to
be applied, and the expected output in the target system.
● Example: Imagine a company wants to consolidate sales data from
multiple regional databases into a central data warehouse. The ETL
tester needs to understand what data is considered "sales data," how it's
defined in each region, and what kind of reports and analyses will be
generated from the data warehouse.

2. Raising Clarifications:
● Explanation: If the requirements or specifications are unclear or
incomplete, the tester must raise questions and seek clarification from
the business analysts, developers, or other stakeholders.
● Example: If the requirement states "cleanse customer addresses," the
tester might ask: "What specific types of address errors need to be
corrected? What is the standard address format we should use?"

3. Test Plan Preparation:
● Explanation: The tester creates a test plan that outlines the scope of
testing, the testing strategy, the test environment, the test data, and the
schedule.
● Example: A test plan might include sections on:
○ Scope: Testing the loading of sales data for the past year.
○ Strategy: Using a combination of data validation, boundary testing,
and performance testing.
○ Test Data: Creating a set of test data that includes valid and invalid
sales records.
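The test-data idea above can be sketched in a few lines. This is a minimal illustration, not a real mapping: the field names (region, sale_date, sales_amount) and the specific error types are assumptions chosen for the example.

```python
# Sketch: build a small test-data set containing both valid and invalid
# sales records. Field names and error cases are illustrative only.
valid_records = [
    {"region": "A", "sale_date": "2024-03-01", "sales_amount": 150.00},
    {"region": "B", "sale_date": "2024-06-15", "sales_amount": 99.95},
]
invalid_records = [
    {"region": "A", "sale_date": "2024-13-40", "sales_amount": 150.00},  # impossible date
    {"region": "?", "sale_date": "2024-03-01", "sales_amount": -10.00},  # negative amount
    {"region": "B", "sale_date": None, "sales_amount": 75.00},           # missing date
]
test_data = valid_records + invalid_records
```

Mixing valid and invalid rows in one set lets a single load run exercise both the happy path and the ETL process's rejection and error-handling logic.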

4. Test Case Preparation with SQL Queries Based on Technical Design or Mapping Document:
● Explanation: The tester designs detailed test cases that cover all
aspects of the ETL process. These test cases often involve writing SQL
queries to validate the data transformations and the data in the target
system.
● Example:
○ Test Case: Verify that the "total sales" in the data warehouse is the
sum of "sales amount" from the source databases.
○ SQL Query:
SELECT SUM(sales_amount) FROM source_database.sales_table;
SELECT SUM(total_sales) FROM data_warehouse.sales_fact_table;
-- Compare the results of the two queries.
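The comparison hinted at by the comment can be automated. The sketch below uses an in-memory SQLite database as a stand-in; the table and column names mirror the example above, but in a real project the two queries would run against the actual source and warehouse connections.

```python
import sqlite3

# Sketch: validate that the warehouse total equals the source total.
# SQLite stands in for the real source and target databases.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_sales (sales_amount REAL)")
conn.execute("CREATE TABLE sales_fact (total_sales REAL)")
conn.executemany("INSERT INTO source_sales VALUES (?)", [(100.0,), (250.5,)])
conn.executemany("INSERT INTO sales_fact VALUES (?)", [(100.0,), (250.5,)])

source_total = conn.execute("SELECT SUM(sales_amount) FROM source_sales").fetchone()[0]
target_total = conn.execute("SELECT SUM(total_sales) FROM sales_fact").fetchone()[0]

# The test case passes only when the two totals agree.
assert source_total == target_total, f"Mismatch: {source_total} != {target_total}"
```

Encoding the comparison as an assertion turns the test case into something repeatable: it can be rerun after every defect fix instead of eyeballing two result sets.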

5. Reviewing Test Cases and SQL Queries:
● Explanation: The tester's test cases and SQL queries are reviewed by
other testers or developers to ensure they are accurate, comprehensive,
and cover all the requirements.
● Example: A peer reviewer might check that the SQL queries are
syntactically correct and that the test cases cover all the edge cases and
boundary conditions.

6. Executing Test Cases:
● Explanation: The tester executes the test cases in the test environment
and records the results.
● Example: The tester runs the SQL queries from the test cases, compares
the results with the expected results, and notes any discrepancies.
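Execution and result recording can be captured in a small harness. This is a minimal sketch: the case IDs, expected values, and actual values are illustrative placeholders, not results from any real run.

```python
from datetime import date

# Sketch: execute test cases and record each result. In practice the
# "actual" values would come from running the test case's SQL queries.
def run_case(case_id, actual, expected):
    status = "PASS" if actual == expected else "FAIL"
    return {"case": case_id, "executed_on": str(date.today()),
            "expected": expected, "actual": actual, "status": status}

log = [
    run_case("TC-001", actual=1000, expected=1000),      # row-count check
    run_case("TC-005", actual=4800.0, expected=5000.0),  # total-sales check
]
for entry in log:
    print(entry["case"], entry["status"])
```

Recording expected and actual values alongside the status gives the defect report (step 7) its evidence for free: a FAIL entry already contains the discrepancy to attach to the defect.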

7. Raising Defects:
● Explanation: If the actual results do not match the expected results, the
tester raises a defect and provides detailed information about the issue.
● Example: The tester finds that the "total sales" in the data warehouse is
incorrect for a specific region. They raise a defect with the following
information:
○ Defect ID: ETL-123
○ Description: Total sales for Region A are incorrect.
○ Steps to Reproduce: Run the SQL queries from test case TC-005.
○ Expected Result: Total sales should match the sum of sales
amounts from the source database.
○ Actual Result: Total sales are lower than the sum of sales amounts.

8. Active Participation in Defect Triage Calls:
● Explanation: The tester participates in defect triage meetings to discuss
and prioritize the defects with the developers and other stakeholders.
● Example: The tester explains the details of the defect, answers
questions from the developers, and helps to determine the severity and
priority of the defect.

9. Tracking Defects:
● Explanation: The tester tracks the status of the defects and ensures
that they are resolved in a timely manner.
● Example: The tester uses a defect tracking tool to monitor the progress
of the defects and follows up with the developers if necessary.

10. Updating and Sharing Regular Status:
● Explanation: The tester provides regular status updates to the project
manager and other stakeholders.
● Example: The tester sends a daily or weekly report that includes
information on the number of test cases executed, the number of defects
found, and the overall status of the testing effort.

11. Giving Signoff:
● Explanation: Once all the test cases have been executed and all the
defects have been resolved, the tester provides a signoff to indicate that
the ETL process is ready for production.
● Example: The tester sends an email to the project manager stating that
the testing is complete and the ETL process has met all the
requirements.
