
1,234 ETL Lead Jobs nearby Khammam

posted 1 week ago

GCP Technical Lead

Hucon Solutions India Pvt. Ltd.
Experience: 6 to 9 Yrs
Location: Hyderabad, Bangalore, Chennai, Kochi, Pune

skills
  • sql
  • security
  • python
  • gcp
  • devops
  • terraform
  • kubernetes
Job Description
Job Title: GCP Technical Lead
Employment Type: Permanent
Industry of the Employer: IT / Software Services
Department / Functional Area: Cloud Engineering, Data Engineering, DevOps

Hiring for a leading MNC.

Role: GCP Technical Lead
Skills: GCP, Python, SQL, BigQuery, Jenkins, Terraform, CI/CD, ETL/ELT
Experience: 6-9 Years
Locations: Chennai, Kochi, Bangalore, Hyderabad, Pune

Eligibility Criteria / Required Skills
- Strong experience in Python, SQL, Data Warehousing concepts, and Data Modeling
- Expertise in GCP services: BigQuery, Cloud Run, Pub/Sub, Cloud Storage, Spanner, Cloud Composer, Dataflow, Cloud Functions
- Hands-on experience with Docker, Kubernetes, GitHub
- Strong understanding of Microservices and Serverless Architecture
- Ability to design scalable, secure, and cost-efficient cloud solutions
- Experience with Infrastructure as Code (IaC) using Terraform
- Knowledge of Cloud Security principles, IAM, and governance
- Experience with PySpark and Big Data tools
- Basic cloud networking knowledge
- Google Professional Cloud Architect / DevOps Engineer certification preferred
- Familiarity with the F&A domain is an added advantage
- Excellent communication and leadership skills

Role Responsibilities
- Lead the design and architecture of end-to-end cloud solutions on GCP
- Oversee development of scalable ETL/ELT pipelines and cloud-native workflows
- Implement CI/CD pipelines using Jenkins and DevOps best practices
- Architect microservices- and serverless-based applications
- Drive cloud security, performance tuning, and cost optimization
- Build and maintain data pipelines using BigQuery, Dataflow, Cloud Storage, and Cloud Composer
- Guide teams through code reviews, best practices, and cloud standards
- Collaborate with cross-functional teams to ensure architectural alignment
- Ensure cloud compliance, governance, and secure architecture

Keywords / Skills: GCP, Python, SQL, Terraform, Jenkins, BigQuery, Cloud Composer, Pub/Sub, CI/CD, ETL, ELT, Microservices, Kubernetes, Docker, IAM, Cloud Security, Dataflow, Serverless, PySpark, Big Data

Total Experience: 6 to 9 Years
Salary Type: Yearly
Annual Salary Offered: As per company norms
Job Type: Full Time
Shift Type: Day Shift / Rotational (based on project requirement)
Location of the Job: Chennai | Kochi | Bangalore | Hyderabad | Pune

Why Join Us
- Opportunity to work on cutting-edge cloud transformation projects
- Collaborative and high-growth environment
- Exposure to multi-cloud and hybrid cloud technologies
- Leadership opportunities in shaping cloud strategy and architecture

If you are passionate about building world-class cloud solutions and want to be part of an innovative team, we'd love to hear from you. Apply now!
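To make the "build and maintain data pipelines using BigQuery and Cloud Storage" responsibility concrete, here is a minimal sketch of one such step: loading a CSV extract from Cloud Storage into a BigQuery staging table with the official Python client. The project, bucket, and table names are hypothetical placeholders, not from this posting.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,          # skip the header row
    autodetect=True,              # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-raw-zone/orders/2024-01-01.csv",   # hypothetical source file
    "my-analytics-project.staging.orders",           # hypothetical target table
    job_config=job_config,
)
load_job.result()  # block until the load job finishes

table = client.get_table("my-analytics-project.staging.orders")
print(f"Loaded table now has {table.num_rows} rows")
```

In a production pipeline this load would typically be orchestrated by Cloud Composer and followed by SQL transformations inside BigQuery, matching the ELT pattern the posting names.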

posted 2 months ago

ETL Engineer

NTT DATA Services
Experience: 6 to 10 Yrs
Location: Hyderabad, Telangana
skills
  • SQL
  • PL/SQL
  • ETL
  • Relational databases
  • PostgreSQL
  • Tableau
  • Power BI
  • Airflow
  • Agile Methodologies
  • Salesforce.com
  • GUS/JIRA
Job Description
As an ETL Engineer at NTT DATA in Hyderabad, Telangana (IN-TG), India, you will be responsible for the following:
- Having 6+ years of experience in SQL, PL/SQL, and ETL is mandatory.
- Utilizing your experience with relational databases like PostgreSQL.
- Working with data visualization tools such as Tableau, Power BI, or similar platforms.
- Leveraging your experience with the Salesforce.com platform and related technologies.
- Demonstrating proficiency in data manipulation and analysis using SQL.
- Possessing knowledge of Airflow and Tableau is considered a plus.
- Implementing Agile methodologies and being well-versed with GUS/JIRA.
- Utilizing strong communication skills to communicate effectively at all levels.
- Taking a proactive approach to problem-solving.
- Following release management CI/CD code deployment processes to migrate code changes.
- Participating in daily scrum calls and collaborating in a global model environment.

About NTT DATA:
NTT DATA is a $30 billion trusted global innovator of business and technology services. Serving 75% of the Fortune Global 100, NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Its services include business and technology consulting, data and artificial intelligence, industry solutions, and the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is one of the leading providers of digital and AI infrastructure in the world. As part of the NTT Group, it invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit NTT DATA at us.nttdata.com.
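A hedged sketch of the kind of source-to-target reconciliation an ETL engineer in this role might automate against PostgreSQL. The connection details and table names are hypothetical, and the check is deliberately simple (row counts only); real validations would also compare checksums and key coverage.

```python
import psycopg2

SRC_QUERY = "SELECT COUNT(*) FROM staging.accounts"
TGT_QUERY = "SELECT COUNT(*) FROM warehouse.dim_account"

with psycopg2.connect(host="db.example.com", dbname="analytics",
                      user="etl_ro", password="...") as conn:
    with conn.cursor() as cur:
        cur.execute(SRC_QUERY)
        src_count = cur.fetchone()[0]
        cur.execute(TGT_QUERY)
        tgt_count = cur.fetchone()[0]

if src_count != tgt_count:
    # In a real pipeline this would raise an alert (email, Slack, JIRA ticket)
    raise ValueError(f"Row count mismatch: source={src_count}, target={tgt_count}")
print("Reconciliation passed:", src_count, "rows")
```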
posted 1 week ago

Technical Delivery Lead

GlobalLogic
Experience: 3 to 7 Yrs
Location: Hyderabad, Telangana
skills
  • SQL
  • Relational databases
  • Distributed systems
  • Scalability
  • Performance tuning
  • ETL/ELT workflows
  • NoSQL databases
  • Event-driven messaging platforms
  • API Gateway tools
Job Description
Role Overview:
As a Technical Delivery Lead at GlobalLogic, you will develop robust ETL/ELT workflows to ingest, transform, and load data into the data warehouse, Azure Synapse Analytics, using Informatica. You will contribute to the data ecosystem by performing exploratory data analysis (EDA), ensuring data accuracy, and maintaining data integrity and reliability. Additionally, you will collaborate with data analysts and business users to translate requirements into effective data models.

Key Responsibilities:
- Develop robust ETL/ELT workflows using Informatica for data ingestion, transformation, and loading into Azure Synapse Analytics.
- Perform exploratory data analysis (EDA) to validate pipeline outputs and ensure data accuracy.
- Implement proactive checks and monitoring to maintain data integrity and reliability.
- Write and optimize complex SQL queries to support analytical and reporting teams.
- Collaborate with data analysts and business users to understand requirements and translate them into effective data models.
- Contribute to system-level design, technical planning, and story breakdowns.
- Collaborate with backend, frontend, and DevOps teams to deliver integrated solutions.
- Continuously learn and adopt new technologies, frameworks, and best practices.
- Champion the use of AI, automation, and modern frameworks to enhance product performance and developer efficiency.
- Understand the business context and build features aligning with digital product and customer experience goals.

Qualifications Required:
- 3 to 5 years of professional experience in ETL/ELT workflows.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) and relational databases.
- Exposure to event-driven messaging platforms (Kafka or similar).
- Knowledge of API Gateway tools (e.g., Kong, Zuul, Apigee, AWS API Gateway).
- Solid foundation in distributed systems, scalability, and performance tuning.
- Bachelor's degree in Computer Science, Software Engineering, or related field.
- Experience in digital products, e-commerce platforms, or transaction systems is preferred.
- Familiarity with regulatory frameworks (PCI, SOX, GDPR) is a plus.
- Certifications in cloud platforms (AWS, GCP, Azure) or front-end frameworks are advantageous.
- Excellent verbal and written communication skills.
- Ability to translate business requirements into technical specifications and vice versa.
- Strong problem-solving and analytical thinking skills.
- Experience in Agile/Scrum environments is beneficial.
- Comfortable presenting ideas and engaging in technical discussions.

Additional Company Details:
GlobalLogic is a trusted digital engineering partner to the world's largest companies, offering a culture of caring, continuous learning, interesting and meaningful work, balance, flexibility, and a high-trust organization. As part of the team, you will have the opportunity to work on impactful projects, grow personally and professionally, and be part of a safe, reliable, and ethical global company.
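A minimal sketch of the "proactive checks and monitoring" duty above, assuming a pandas DataFrame pulled from the pipeline output; the column names and thresholds are illustrative, not from any actual Synapse schema.

```python
import pandas as pd

def validate_pipeline_output(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data quality violations."""
    issues = []
    if df["order_id"].duplicated().any():
        issues.append("duplicate order_id values found")
    null_rates = df.isna().mean()
    for col, rate in null_rates[null_rates > 0.05].items():
        issues.append(f"column {col!r} is {rate:.0%} null (threshold 5%)")
    if (df["amount"] < 0).any():
        issues.append("negative amounts present")
    return issues

df = pd.read_parquet("pipeline_output.parquet")  # hypothetical extract
for issue in validate_pipeline_output(df):
    print("DATA QUALITY:", issue)
```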

posted 2 weeks ago
Experience: 3 to 8 Yrs
Location: Hyderabad, Telangana
skills
  • PeopleSoft
  • Workday
  • SQL
  • Application Engine
  • SQR
  • PS Query
  • data modeling
  • scripting
  • ETL processes
  • Oracle databases
  • Enterprise Interface Builder (EIB)
  • Workday Studio
  • Workday APIs
Job Description
You will be responsible for the technical design, development, and execution of the data migration from PeopleSoft to Workday. Your deep technical expertise in both PeopleSoft and Workday data structures and tools will be crucial in managing the technical conversion team to ensure data accuracy, integrity, and timely delivery throughout the implementation lifecycle.

Key Responsibilities:
- Participate in and lead the design and architecture of the data conversion strategy, including data mapping, transformation rules, and technical specifications for extracting data from PeopleSoft HCM modules.
- Design, develop, and oversee the development and execution of ETL processes using PeopleSoft tools such as Application Engine, SQR, and PS Query for data extraction.
- Guide and mentor junior technical developers and analysts, providing technical oversight, code reviews, and quality assurance for all conversion deliverables.
- Design and implement data validation and reconciliation processes to ensure the accuracy and completeness of converted data.
- Optimize and fine-tune SQL queries, PeopleSoft processes, and Workday integrations to improve the efficiency and performance of conversion cycles.
- Troubleshoot complex technical issues during the conversion process, including data load errors and reconciliation discrepancies, in collaboration with functional leads and clients.
- Create and maintain comprehensive technical documentation, including data dictionaries, ETL specifications, technical designs, and process flows.
- Partner with functional consultants, business stakeholders, and system integrators to translate business requirements into technical solutions and ensure alignment with the overall project plan.
- Gain exposure to Workday tools such as Enterprise Interface Builder (EIB), Workday Studio, and Workday APIs.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Systems, or a related technical field.
- Experience: 5-8+ years of hands-on technical experience with data conversion and integration, with a minimum of 3 years as a technical lead on Workday implementation projects.
- Technical Skills: Expertise in ETL processes, SQL, data modeling, and scripting.
- Leadership and Communication: Demonstrated experience leading and mentoring technical teams, with strong communication skills to interface with both technical teams and non-technical stakeholders.
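A simplified sketch of the source-to-target mapping layer this role describes. Field names on both the PeopleSoft and Workday sides are hypothetical; real conversions involve far richer transformation rules and validation, but the pattern of "one rule per target field, errors collected per row" is the core idea.

```python
from datetime import datetime

# Transformation rules: target field -> function of the PeopleSoft-style row
MAPPING = {
    "employee_id": lambda row: row["EMPLID"].strip(),
    "legal_name":  lambda row: f"{row['FIRST_NAME']} {row['LAST_NAME']}".title(),
    "hire_date":   lambda row: datetime.strptime(row["HIRE_DT"], "%d-%b-%y").date().isoformat(),
    "is_active":   lambda row: row["EMPL_STATUS"] == "A",
}

def convert_row(ps_row: dict) -> dict:
    """Apply each mapping rule, collecting errors instead of failing fast."""
    out, errors = {}, []
    for target_field, rule in MAPPING.items():
        try:
            out[target_field] = rule(ps_row)
        except (KeyError, ValueError) as exc:
            errors.append(f"{target_field}: {exc}")
    if errors:
        raise ValueError(f"Row {ps_row.get('EMPLID')}: " + "; ".join(errors))
    return out

print(convert_row({"EMPLID": " 1001 ", "FIRST_NAME": "ANITA", "LAST_NAME": "RAO",
                   "HIRE_DT": "01-Mar-19", "EMPL_STATUS": "A"}))
```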
posted 2 days ago

ETL Talend Administrator

Experience: 3 to 10 Yrs
Location: Hyderabad, Telangana
skills
  • scheduling
  • monitoring
  • troubleshooting
  • data integration
  • APIs
  • flat files
  • cleansing
  • validation
  • ETL tools administration
  • Talend Administration
  • deployment
  • data warehouse systems
  • data quality checks
  • error handling
Job Description
As an ETL Talend Administrator, your role will involve the installation, configuration, and upgrade of Talend tools in on-premises or cloud environments. You will be responsible for managing and monitoring ETL jobs, workflows, and schedules to ensure reliable data integration and pipeline execution. Troubleshooting job failures, performance issues, and connectivity errors between Talend and databases/APIs will also be a key part of your responsibilities. Additionally, you will implement and manage user roles, permissions, and security policies within Talend Administration Center (TAC).

Key Responsibilities:
- Install, configure, and upgrade Talend tools in various environments
- Manage and monitor ETL jobs, workflows, and schedules
- Troubleshoot job failures, performance issues, and connectivity errors
- Implement and manage user roles, permissions, and security policies in TAC
- Support deployment and migration of Talend jobs across environments
- Optimize job performance through parameterization, load balancing, and resource tuning
- Perform backup, recovery, and disaster recovery planning for Talend repositories
- Ensure compliance with data governance, security, and quality standards
- Document all administration tasks, workflows, and configurations
- Collaborate with cross-functional teams to support data warehouse and integration needs

Qualifications Required:
- 3 to 10 years of experience in ETL tools administration, with a focus on Talend
- Proficiency in Talend Administration Center (TAC) and related tasks
- Strong knowledge of ETL concepts, data integration, and pipeline orchestration
- Hands-on experience with databases, APIs, flat files, and data warehouse systems
- Experience in data quality checks, cleansing, validation, and error handling
- Understanding of server administration, resource allocation, and performance tuning

The preferred qualifications for this role include a Bachelor's degree in Computer Science, Information Technology, or a related field. Experience with Talend Big Data or Talend Cloud platforms is a plus, as is exposure to other ETL tools like Informatica or NiFi.
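A sketch of the job-failure monitoring loop an administrator might script around such a platform. The REST endpoint and response shape here are hypothetical stand-ins; the real Talend Administration Center exposes its own API for task status, so treat this purely as the shape of the automation, not its exact calls.

```python
import requests

MONITOR_URL = "https://tac.example.com/api/tasks"   # hypothetical endpoint
ALERT_STATUSES = {"FAILED", "KILLED"}

resp = requests.get(MONITOR_URL, timeout=30, auth=("tac_monitor", "..."))
resp.raise_for_status()

for task in resp.json():  # assumed: a list of task dicts with status fields
    if task.get("lastRunStatus") in ALERT_STATUSES:
        # Real-world follow-up: page on-call, open a ticket, retry the job
        print(f"ALERT: task {task['label']} ended with {task['lastRunStatus']}")
```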
posted 1 week ago

ETL / Data QA

Anblicks
Experience: 5 to 9 Yrs
Location: Hyderabad, Telangana
skills
  • Snowflake
  • SQL
  • Quality Assurance
  • Data Warehousing
  • Agile development
  • Regression Testing
  • ETL/ELT testing
  • QA Automation Frameworks
Job Description
As a Quality Assurance Engineer for data testing, your role involves end-to-end quality ownership of ETL/ELT data pipelines, with a strong focus on Snowflake environments.

Key Responsibilities:
- Design, develop, and execute comprehensive test plans, test cases, and test scripts for ETL/ELT data pipelines
- Validate large volumes of data in data warehouses, with a strong focus on Snowflake environments
- Perform end-to-end data validation, source-to-target data mapping, and regression testing
- Identify, report, and track data quality issues, collaborating closely with data engineers and analysts to ensure timely resolution
- Automate data quality tests using SQL and QA automation frameworks
- Conduct root cause analysis of issues in ETL processes and provide actionable recommendations
- Participate in Agile development processes, including sprint planning, stand-ups, and retrospectives
- Provide QA sign-off for data pipeline releases and ensure delivery meets functional and non-functional requirements

Qualifications Required:
- Minimum 5 years of experience in Quality Assurance with a focus on data testing
- Strong experience in ETL/ELT testing and data warehousing concepts
- Proficiency in Snowflake and SQL for data validation and analysis
- Experience with test management tools and defect tracking systems
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
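A hedged sketch of automating one source-to-target check as a pytest test against Snowflake, using the snowflake-connector-python package; the account, warehouse, and table names are placeholders.

```python
import snowflake.connector

def fetch_one(cur, sql):
    cur.execute(sql)
    return cur.fetchone()[0]

def test_orders_count_matches_source():
    conn = snowflake.connector.connect(
        account="xy12345", user="qa_bot", password="...",
        warehouse="QA_WH", database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # Source-to-target row count check; a fuller suite would also
        # compare column checksums and spot-check business rules
        src = fetch_one(cur, "SELECT COUNT(*) FROM RAW.ORDERS")
        tgt = fetch_one(cur, "SELECT COUNT(*) FROM MARTS.FCT_ORDERS")
        assert src == tgt, f"source={src} target={tgt}"
    finally:
        conn.close()
```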
posted 3 weeks ago

Informatica Power Center Technical Lead

Experience: 8 to 12 Yrs
Location: Hyderabad, Telangana
skills
  • Informatica Power Center
  • Oracle
  • Data Warehousing
  • Dimensional modeling
  • Data governance
  • Data quality
  • Code Analysis
  • Refactoring
  • Communication
  • Collaboration
  • SDLC
  • Agile methodologies
  • Customer communication
  • Adaptive
  • ETL/ELT development
  • SQL/PL/SQL
  • DWH concepts
  • Legacy ETL pipelines
  • Cloud Experience
  • GCP data stack
  • DevOps Mindset
  • Status reporting
  • Team player
  • Proactive
Job Description
Role Overview:
As a highly skilled and experienced Informatica Power Center Technical Lead, you will be responsible for ETL development using Informatica Power Center. The role calls for strong experience in ETL/ELT development and lifecycle management, advanced proficiency in Oracle SQL/PL/SQL, a solid grounding in data warehousing concepts, and the ability to analyze and refactor code. You should be able to work independently, communicate effectively, collaborate with a range of stakeholders, and have a good understanding of cloud technologies.

Key Responsibilities:
- Develop ETL processes using Informatica Power Center
- Utilize advanced SQL/PL/SQL skills for Oracle, including performance tuning and data modeling
- Apply a solid understanding of data warehousing concepts, dimensional modeling, data governance, and data quality
- Review and enhance legacy ETL pipelines through code analysis and refactoring
- Take ownership of tasks, manage priorities, and deliver results with minimal supervision
- Document designs, decisions, and operational runbooks effectively
- Collaborate with data engineers, analysts, and business stakeholders
- Demonstrate familiarity with the GCP data stack for DWH, including BigQuery, Dataflow/Dataproc, Cloud Composer, Cloud Storage, and Looker
- Embrace a DevOps mindset and work collaboratively with the team
- Apply knowledge of SDLC and Agile methodologies
- Communicate with customers and provide daily status reports
- Work well in a team environment, displaying proactive and adaptive behavior

Qualifications Required:
- 8 to 10 years of experience in ETL development using Informatica Power Center
- Proficiency in Oracle SQL/PL/SQL, data warehousing concepts, and data modeling
- Strong oral and written communication skills, with the ability to collaborate effectively
- Familiarity with cloud technologies, particularly the GCP data stack
- Understanding of SDLC and Agile methodologies
- Proven track record of working independently and delivering results
- Ability to analyze and refactor code for ETL pipelines
- Demonstrated ability to prioritize tasks and manage workload effectively
posted 5 days ago

ETL Lead

Caliber Technologies
Experience: 5 to 9 Yrs
Location: Hyderabad, Telangana
skills
  • MSBI
  • SSIS
  • SSRS
  • SQL Server
  • Oracle
  • Snowflake
  • Dimensional modeling
  • CDC
  • SCD
Job Description
Role Overview:
As an experienced ETL Lead/ETL Manager at Caliber Technologies, your primary responsibility will be to lead data integration initiatives and oversee high-quality ETL development for the APQR solution. This on-site role in Hyderabad will require you to design, develop, and optimize ETL workflows, mentor technical teams, and collaborate with various stakeholders to deliver compliant, high-quality solutions.

Key Responsibilities:
- Design, develop, and optimize ETL workflows supporting analytics and data integration.
- Lead and mentor ETL developers by conducting code reviews and providing technical guidance.
- Manage project timelines and sprint planning, and communicate effectively with stakeholders.
- Ensure data quality, performance tuning, and error-free job execution.
- Collaborate with cross-functional teams to deliver high-quality solutions.
- Support troubleshooting, root-cause analysis, and continuous improvement initiatives.
- Contribute to R&D efforts aimed at enhancing data processing and APQR capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Strong analytical and problem-solving skills.
- Proven experience in leading technical teams.
- Hands-on experience in ETL development, data integration, and data warehousing.
- Experience in technical support and issue resolution.
- Familiarity with pharmaceutical or life sciences compliance (GxP) is a plus.
- Strong project management skills and a sense of ownership towards delivery.

About Caliber Technologies:
Caliber Technologies is a leading GxP solution provider for the pharmaceutical industry, focused on delivering innovative digital solutions for laboratories worldwide. Committed to supporting organizations in ensuring the safety, quality, and efficacy of their products, Caliber offers solutions spanning Laboratory Management, Manufacturing Process Automation, Quality Management, and Regulatory Compliance.

Why Join Us:
Caliber Technologies gives you the opportunity to work with cutting-edge digital solutions in the GxP domain. You will have a high-impact role that directly influences business growth, in a collaborative work environment with strong learning and development opportunities.
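The skills list for this role calls out dimensional modeling, CDC, and SCD handling alongside SSIS and SQL Server. As a concrete illustration, here is a minimal sketch of a Type 2 slowly changing dimension upsert expressed as T-SQL, held in a Python string as it might live inside an ETL script task; the table and column names are illustrative only.

```python
SCD2_UPSERT = """
-- Close out the current version of any customer whose attributes changed
UPDATE d
SET    d.valid_to = SYSUTCDATETIME(), d.is_current = 0
FROM   dim_customer d
JOIN   stg_customer s ON s.customer_id = d.customer_id
WHERE  d.is_current = 1
  AND  (s.city <> d.city OR s.segment <> d.segment);

-- Insert a fresh current row for new and changed customers
INSERT INTO dim_customer (customer_id, city, segment, valid_from, valid_to, is_current)
SELECT s.customer_id, s.city, s.segment, SYSUTCDATETIME(), NULL, 1
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE  d.customer_id IS NULL;
"""
# Executed against SQL Server via pyodbc or an SSIS Execute SQL task;
# unchanged customers still have a current row, so neither statement touches them.
```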
posted 3 days ago

ETL Developer

Hitachi Careers
Experience: 3 to 7 Yrs
Location: Hyderabad, Telangana
skills
  • Python
  • Informatica
  • SQL
  • MongoDB
  • Cassandra
  • Kafka
  • Apigee
  • distributed systems
  • scalability
  • performance tuning
  • Azure Data Factory (ADF)
  • Azure Databricks
  • PySpark
  • Delta Lake
  • ETL/ELT
  • data pipelines
  • data lakehouse architecture
  • NoSQL databases
  • event-driven messaging platforms
  • API Gateway tools
  • Kong
  • Zuul
  • AWS API Gateway
Job Description
Role Overview:
You will work with Python, Azure Data Factory (ADF), Azure Databricks, PySpark, Delta Lake, ETL/ELT, data pipelines, and data lakehouse architecture.

Key Responsibilities:
- Develop robust ETL/ELT workflows to ingest, transform, and load data into our data warehouse, Azure Synapse Analytics, using Informatica.
- Perform exploratory data analysis (EDA) to validate pipeline outputs and ensure data accuracy.
- Implement proactive checks and monitoring to maintain data integrity and reliability.
- Write and optimize complex SQL queries to support the data needs of analytical and reporting teams.
- Collaborate with data analysts and business users to understand their requirements and translate them into effective data models.
- Contribute to system-level design, technical planning, and story breakdowns.
- Collaborate across backend, frontend, and DevOps teams to deliver integrated solutions.
- Learn and adopt new technologies, frameworks, and best practices.
- Champion the use of AI, automation, and modern frameworks to enhance product performance and developer efficiency.
- Understand the business context and build features that align with digital product and customer experience goals.

Qualifications Required:
- 3 to 5 years of professional experience in ETL/ELT workflows.
- Familiarity with NoSQL databases (e.g., MongoDB, Cassandra) and relational databases.
- Exposure to event-driven messaging platforms (Kafka or similar).
- Knowledge of API Gateway tools (e.g., Kong, Zuul, Apigee, AWS API Gateway).
- Solid foundation in distributed systems, scalability, and performance tuning.
- Bachelor's degree in Computer Science, Software Engineering, or related field.

Additional Details of the Company:
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, GlobalLogic has been at the forefront of the digital revolution, collaborating with clients to transform businesses and redefine industries through intelligent products, platforms, and services. GlobalLogic prioritizes a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and operates as a high-trust organization where integrity is key.
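A minimal PySpark sketch of the lakehouse pattern this posting names: ingest raw JSON, apply transformations, and write a Delta table. The paths and column names are hypothetical and assume a Databricks-style environment with Delta Lake available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("/mnt/raw/orders/")   # hypothetical landing path

# Assumed raw columns: order_id, order_ts (string), amount
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
       .filter(F.col("amount") > 0)
)

(cleaned.write
        .format("delta")
        .mode("append")
        .partitionBy("order_date")          # partition for pruning on date filters
        .save("/mnt/lakehouse/silver/orders"))
```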
posted 2 months ago

Associate ETL Architect

Neev Systems
Experience: 3 to 7 Yrs
Location: Hyderabad, Telangana
skills
  • Data Warehousing
  • Data Modeling
  • SQL
  • Snowflake
  • Oracle
  • MySQL
  • SQL Server
  • JAVA
  • Talend Data Integration
Job Description
You will be joining a dynamic team as an Associate ETL Architect at Neev Systems, where your primary responsibility will be to design, develop, and implement the data ingestion framework and data integration solutions using the Talend ETL tool. Your expertise in data warehousing and data modeling, together with experience across various databases, will be crucial to the success of the projects.

Key Responsibilities:
- Design, develop, and implement ETL processes using Talend Data Integration.
- Collaborate with the business to understand data integration requirements.
- Perform data profiling, data cleansing, and data transformation tasks.
- Develop and maintain documentation related to ETL processes, data mappings, and data dictionaries.
- Optimize and tune ETL processes for performance and scalability.
- Troubleshoot and resolve issues related to data integration and ETL processes.
- Monitor production jobs daily and fix production issues if any arise.
- Define Talend development standards and review the ETL process.
- Design workarounds or propose solutions for any technical challenges faced by the team in the ETL space.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Talend developer.
- In-depth knowledge of data integration concepts, ETL processes, and data warehousing.
- Proficiency in using Talend Open Studio or Talend Data Integration for designing and implementing ETL jobs.
- Strong SQL skills and experience working with various databases such as Snowflake, Oracle, MySQL, SQL Server, etc.
- Excellent problem-solving and analytical skills; exposure to Java code to customize Talend functionalities.
- Strong communication and collaboration skills, with the ability to work independently and as part of a team.
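A small sketch of the data profiling step mentioned above, assuming the extract has already been staged to a CSV; the file name and the null-rate flag are illustrative.

```python
import pandas as pd

df = pd.read_csv("staged_extract.csv")  # hypothetical staging file

# One profiling row per source column: type, null rate, cardinality, sample
profile = pd.DataFrame({
    "dtype":        df.dtypes.astype(str),
    "null_pct":     df.isna().mean().round(3),
    "distinct":     df.nunique(),
    "sample_value": df.iloc[0] if len(df) else None,
})
print(profile)

# Flag columns that would break a NOT NULL target constraint
suspect = profile[profile["null_pct"] > 0]
if not suspect.empty:
    print("Columns needing cleansing rules:", suspect.index.tolist())
```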
posted 2 weeks ago

Database Lead

Adtech Corp
Experience: 5 to 10 Yrs
Location: Hyderabad, Telangana
skills
  • PostgreSQL
  • SQL Server
  • MySQL
  • Database Development
  • Data Modeling
  • Performance Optimization
  • Kubernetes
  • CI/CD
  • Cloud Databases
  • ETL Pipelines
  • PostgreSQL Extensions
Job Description
Role Overview:
As a Database Lead at Adtech Corp, you will play a crucial role in managing, designing, and optimizing databases, ensuring their availability, performance, and scalability. Your responsibilities will include database administration, designing database structures, creating data models, and analyzing system performance to enhance operations. Collaboration with cross-functional teams to align database systems with organizational objectives will be a key aspect of your role.

Key Responsibilities:
- Design scalable database architectures, schemas, and data models for transactional and analytical workloads.
- Develop and optimize complex stored procedures, UDFs, triggers, and queries.
- Perform performance tuning, indexing, partitioning, sharding, and query plan optimization.
- Implement replication, migration, and backup/restore strategies across environments.
- Manage deployment scripts, versioning (Flyway/Liquibase), and CI/CD pipelines.
- Utilize advanced PostgreSQL features like logical/streaming replication, JSONB, full-text search, materialized views, CTEs, and PL/pgSQL tuning.
- Collaborate with backend teams (.NET / Node.js) to optimize database access and ORM performance.
- Review database code, set best practices, and mentor developers.

Qualifications Required:
- 5-10 years of experience in database development, with 5+ years specifically in PostgreSQL (advanced concepts).
- Proven experience in data modeling, architecture, debugging, and performance optimization.
- Strong knowledge of partitioning, replication, sharding, and parallel queries.
- Hands-on experience with CI/CD tools (Git/Jenkins/Azure DevOps) and cloud databases (AWS RDS / Aurora).
- Excellent problem-solving and documentation skills.

Company Description:
Adtech Corp is a CMMI Level 3 appraised technology consulting firm specializing in Microsoft Dynamics 365, custom software development, cloud engineering, quality assurance, and digital business services. The company's mission is to help organizations modernize operations, optimize processes, and enhance customer experiences through innovative solutions. Committed to innovation, customer focus, and quality, Adtech Corp fosters strong partnerships to ensure long-term success for clients.
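A hedged sketch of the declarative range partitioning called out above, issued through psycopg2; the database name, table, and date ranges are examples, not a recommended production layout.

```python
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS events (
    event_id    bigint      NOT NULL,
    occurred_at timestamptz NOT NULL,
    payload     jsonb
) PARTITION BY RANGE (occurred_at);

CREATE TABLE IF NOT EXISTS events_2024_q1
    PARTITION OF events
    FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');

-- Index each partition to keep range scans on hot data fast
CREATE INDEX IF NOT EXISTS idx_events_2024_q1_occurred
    ON events_2024_q1 (occurred_at);
"""

with psycopg2.connect(dbname="appdb", user="dba", password="...",
                      host="localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```

With this layout, queries filtering on occurred_at prune to the relevant partitions, and old quarters can be detached or dropped without rewriting the whole table.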
posted 3 weeks ago
Experience: 4 to 8 Yrs
Location: Telangana
skills
  • LoadRunner
  • JMeter
  • Performance Center
  • Dynatrace
  • Jira
  • Git
Job Description
Role Overview:
As a Performance Testing professional with 4+ years of experience, you will conduct performance testing activities using LoadRunner and JMeter for scripting, execute tests using Performance Center, and analyze results with Dynatrace. You will also work with repository tools like Jira and Git to manage test scripts and results effectively.

Key Responsibilities:
- Conduct performance testing using LoadRunner and JMeter for scripting
- Execute performance tests using Performance Center
- Analyze test results using Dynatrace
- Utilize Jira and Git as repository tools for managing test scripts and results

Qualifications Required:
- 4+ years of experience in performance testing
- Proficiency in LoadRunner for scripting
- Familiarity with JMeter for scripting
- Experience in using Performance Center for test execution
- Ability to analyze test results using Dynatrace
- Knowledge of repository tools such as Jira and Git
posted 3 weeks ago

ETL Tester

Experience: 3 to 7 Yrs
Location: Hyderabad, All India
skills
  • ETL testing
  • Automation testing
  • API testing
  • Selenium
  • SQL
  • MDM testing
Job Description
As an ETL Tester at our company, your role will involve ensuring the quality and reliability of critical business data solutions. Your primary focus will be on ETL testing, alongside automation, API, and MDM testing to support end-to-end data validation and integration. We are looking for a professional who excels in written and verbal communication and is dedicated to delivering high-quality solutions.

Key Responsibilities:
- Design, develop, and execute comprehensive ETL test cases, scenarios, and scripts to validate data extraction, transformation, and loading processes.
- Collaborate with data engineers, business analysts, and QA peers to clarify requirements and ensure accurate data mapping, lineage, and transformations.
- Perform functional, automation, API, and MDM testing to ensure quality assurance.
- Utilize tools like Selenium for automation efforts to enhance ETL testing processes.
- Identify, document, and track defects while proactively communicating risks and issues to stakeholders.
- Drive continuous improvement initiatives to enhance test coverage, efficiency, and effectiveness within the ETL testing framework.
- Create and maintain detailed documentation for test processes and outcomes.

Required Skills and Qualifications:
- Strong hands-on experience in ETL testing, including knowledge of ETL tools and processes.
- Proficiency in automation testing using Selenium or similar frameworks.
- Experience in API testing, functional testing, and MDM testing.
- Excellent written and verbal communication skills to convey technical concepts clearly.
- Solid analytical and problem-solving skills for troubleshooting data and process issues.
- Attention to detail with a dedication to high-quality deliverables.
- Ability to work effectively in a collaborative, fast-paced team environment on-site in Hyderabad.

Preferred: prior experience in large-scale data environments or MDM projects; familiarity with data warehousing concepts, SQL, and data migration best practices; ISTQB or related QA/testing certification.
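A minimal sketch of the API-testing side of this role, written as a pytest module using the requests library; the endpoint and response contract are hypothetical stand-ins for whatever service fronts the MDM/ETL layer.

```python
import requests

BASE_URL = "https://api.example.com"   # hypothetical service under test

def test_customer_lookup_returns_expected_shape():
    resp = requests.get(f"{BASE_URL}/customers/1001", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Contract checks: mandatory fields exist and are typed sensibly
    assert isinstance(body["customer_id"], int)
    assert body["status"] in {"ACTIVE", "INACTIVE"}

def test_unknown_customer_returns_404():
    resp = requests.get(f"{BASE_URL}/customers/0", timeout=10)
    assert resp.status_code == 404
```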
posted 2 weeks ago

ETL Consultant

Creeno Solutions Pvt Ltd
Experience: 4 to 8 Yrs
Location: Hyderabad, All India
skills
  • ETL
  • Talend
  • Oracle SQL
  • Excel
  • SQL scripting
  • data mapping
  • Agile
  • Waterfall
  • communication
  • Dell Boomi
  • .csv files
  • data applications/systems
  • software development lifecycle (SDLC)
  • cloud
  • application integrations
  • problem-solving
Job Description
This is a full-time ETL Consultant role based in Hyderabad requiring 4+ years of experience. As an ETL Consultant, you will assess data and analytic requirements to establish mapping rules from source to target systems that meet business objectives, and work with real-time, batch, and ETL processes for complex data conversions. You will use ETL methodologies and tools such as Talend and Dell Boomi to prepare data for loads based on target system specifications. Strong SQL scripting experience is required for this role.

Key Responsibilities:
- Assess data and analytic requirements to establish mapping rules from source to target systems.
- Work with real-time, batch, and ETL processes for complex data conversions.
- Utilize ETL methodologies and tools such as Talend, Dell Boomi, etc.
- Prepare data for data loads based on target system specifications.
- Use various data applications/systems such as Oracle SQL, Excel, .csv files, etc.
- Communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations, collaborating by phone and email throughout the process.
- Drive improvements in the data migration process.
- Work independently, prioritize tasks, and manage multiple tasks simultaneously.
- Ensure the client's data is converted/integrated accurately and within the deadlines established by the ISW Project Manager.
- Support customer SIT, UAT, migration, and go-live.

Qualifications Required:
- Strong SQL scripting experience.
- Working knowledge of software development lifecycle (SDLC) methodologies, including Agile, Waterfall, and others.
- Clear understanding of cloud and application integrations.
- Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously.
- Demonstrated collaboration and problem-solving skills.

If you are interested in this position, please apply as per the details provided.
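A small sketch of the source-to-target mapping work described above: reshaping a client .csv extract into a target system's load layout using only the standard library. The file and column names are hypothetical.

```python
import csv

FIELD_MAP = {            # target column -> source column
    "ExternalId": "cust_no",
    "FullName":   "customer_name",
    "Email":      "email_addr",
}

with open("client_extract.csv", newline="") as src, \
     open("target_load.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=list(FIELD_MAP))
    writer.writeheader()
    for row in reader:
        # Trim whitespace as a minimal cleansing rule; real conversions
        # would add type coercion, lookups, and validation per field
        writer.writerow({tgt: row[src_col].strip()
                         for tgt, src_col in FIELD_MAP.items()})
```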
posted 6 days ago

QA Lead

Techdome
Experience: 5 to 9 Yrs
Location: Hyderabad, Telangana
skills
  • API testing
  • Load testing
  • UI testing
  • QA tools
  • Test management tools
  • Communication skills
  • Stakeholder management
  • Mobile Automation tools
  • QA methodologies
  • QA processes
  • Problem-solving
  • CI/CD pipelines
  • Cloud-based testing platforms
Job Description
As a Quality Assurance Engineer at Techdome, you will be responsible for ensuring that our products meet the highest quality standards. You should be a highly skilled and experienced QA Lead capable of taking ownership of projects independently from day one. Expertise in end-to-end testing, proficiency across multiple testing domains (API, load, UI, mobile), and strong leadership skills are essential for driving quality processes effectively.

Key Responsibilities:
- Lead QA activities across API, load, UI, and mobile testing.
- Own the testing process and ensure delivery with minimal supervision.
- Define, implement, and maintain quality standards and best practices.
- Conduct database testing and integrate CI/CD pipelines with Jenkins.
- Collaborate with development teams to identify and resolve defects promptly.
- Mentor junior QA team members and provide guidance to interns/associates.
- Ensure effective utilization of test automation frameworks.
- Take complete ownership of testing deliverables within the project timeline.

Qualifications Required:
- Strong hands-on experience in API testing, load testing, and UI testing.
- Proficiency in mobile automation tools such as Appium/APM.
- Ability to work independently and take end-to-end responsibility.
- Sound knowledge of QA methodologies, processes, and tools.
- Excellent problem-solving skills and a process-oriented mindset.

Additional Details:
At Techdome, we expect you to be a self-driven professional capable of handling a team without dependency. You should thrive in a fast-paced, project-driven environment and be committed to delivering high-quality software products.

Tools & Technologies:
- API testing tools: Postman, RestAssured
- UI / functional testing tools: Selenium WebDriver, Playwright, Cypress
- Automation frameworks and languages: Java, Python, TestNG, JUnit, Maven
- Database testing: SQL
- Performance and load testing tools: JMeter, K6, Locust, LoadRunner
- Version control and CI/CD: Git, GitHub/GitLab, Jenkins, Azure DevOps, Docker, Kubernetes (basic knowledge preferred)
- Defect and test management tools: JIRA
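Since Locust appears in the tools list above, here is a minimal load-test sketch with it; the target host and endpoints are placeholders.

```python
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    wait_time = between(1, 3)   # think time between requests, in seconds

    @task(3)
    def list_items(self):
        self.client.get("/api/items")

    @task(1)
    def view_item(self):
        # Group parameterized URLs under one stats entry
        self.client.get("/api/items/42", name="/api/items/[id]")

# Run with:  locust -f loadtest.py --host https://staging.example.com
```

The integer weights on @task give a 3:1 mix of list versus detail requests, which is how Locust models a realistic traffic profile.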
posted 1 month ago

Database Tech Lead

Experience: 5 to 10 Yrs
Location: Telangana
skills
  • SQL queries
  • Stored Procedures
  • Query Optimization
  • Performance Tuning
  • SSIS
  • Data Integration
  • Power BI
  • SSRS
  • Database Design
  • MSSQL Developer
  • Database Developer
  • Functions
  • Database Indexing
  • ETL Processes
  • Normalization Techniques
  • Relational Database Management Systems
Job Description
As a Database Tech Lead, you will leverage your 5+ years of professional experience as an MSSQL or database developer to lead database-related projects. Your responsibilities will include:
- Writing complex SQL queries, stored procedures, and functions.
- Applying strong expertise in query optimization, performance tuning, and database indexing.
- Using knowledge of or experience with Duck Creek Data Insights.
- Working with ETL processes (SSIS) and data integration techniques.
- Working with Power BI or SSRS.
- Applying database design principles and normalization techniques.
- Hands-on experience with one or more relational database management systems (e.g., SQL Server, Oracle, PostgreSQL, MySQL) is a plus.
- Demonstrating excellent problem-solving skills and attention to detail.
- Bringing strong communication and teamwork skills.
- Leadership experience and the ability to mentor junior developers is a plus.

You should have 7-10 years of overall work experience and be located in Hyderabad, as this role requires working from the office.
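A brief sketch of calling a tuned stored procedure from Python via pyodbc, as a tech lead might do when wiring SQL Server logic into reporting scripts; the DSN, procedure name, and parameters are hypothetical.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql.example.com;DATABASE=ClaimsDW;UID=report_ro;PWD=..."
)
cur = conn.cursor()

# Parameterized ODBC call syntax; avoids plan-cache pollution from ad-hoc SQL
cur.execute("{CALL dbo.usp_GetClaimsBySegment (?, ?)}", ("AUTO", 2024))
for row in cur.fetchmany(10):
    print(row)

cur.close()
conn.close()
```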
posted 2 weeks ago
Experience: 5 to 9 Yrs
Location: Hyderabad, Telangana
skills
  • Data warehousing
  • Snowflake
  • SQL
  • ETL/ELT testing
  • QA automation frameworks
  • Agile development processes
Job Description
As a Quality Assurance Engineer for data testing, your role will involve designing, developing, and executing comprehensive test plans, test cases, and test scripts for ETL/ELT data pipelines. You will be responsible for validating large volumes of data in data warehouses, with a particular focus on Snowflake environments. Your tasks will include end-to-end data validation, source-to-target data mapping, and regression testing. You will identify, report, and track data quality issues, collaborating closely with data engineers and analysts to ensure timely resolution. Automating data quality tests using SQL and QA automation frameworks will also be part of your responsibilities, as will root cause analysis of issues in ETL processes and actionable recommendations. You will participate in Agile development processes, including sprint planning, stand-ups, and retrospectives, provide QA sign-off for data pipeline releases, and ensure that delivery meets functional and non-functional requirements.

Qualifications Required:
- Minimum 5 years of experience in Quality Assurance with a focus on data testing.
- Strong experience in ETL/ELT testing and data warehousing concepts.
- Proficiency in Snowflake and SQL for data validation and analysis.
- Experience with test management tools and defect tracking systems.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
posted 3 weeks ago

Databricks Lead Engineer

Vy Systems Private Limited
Experience: 8 to 13 Yrs
Salary: 50,000 - 3.5 LPA
Work Mode: Remote
Location: Hyderabad, Bangalore, Chennai, Gurugram, Kolkata, Mumbai City, Delhi

skills
  • aws
  • azure
  • databricks
Job Description
Job Title: Databricks Lead
Location: Remote
Experience: 8+ Years
Apply: sanjai@vysystems.com

About the Role
We are seeking a skilled Databricks developer with hands-on experience in Apache Spark, PySpark, AWS, and data engineering on the Azure Databricks platform. The ideal candidate will design and implement scalable data pipelines, optimize data processing workflows, and contribute to building robust data solutions supporting analytics and business intelligence initiatives.

Key Responsibilities
- Design, develop, and maintain data pipelines and ETL workflows using Databricks (PySpark, SQL, Scala, or Python).
- Develop scalable and optimized data processing solutions for large datasets.
- Work with Azure Data Lake, Delta Lake, and Azure Data Factory (ADF) to build end-to-end data solutions.
- Implement data transformations, aggregations, and cleansing within Databricks.
- Collaborate with data architects, analysts, and business stakeholders to translate requirements into technical designs.
- Optimize Spark jobs for performance and cost efficiency.
- Monitor, debug, and troubleshoot Databricks jobs and clusters.
- Ensure best practices for data governance, quality, and security.
- Contribute to CI/CD pipelines for data workflows and infrastructure automation.
- Document processes, workflows, and code to ensure maintainability.
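A hedged sketch of one Spark optimization this role mentions: broadcasting a small dimension table to avoid a shuffle when joining it to a large fact table. The paths are hypothetical Databricks mount points.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("fact-enrichment").getOrCreate()

facts = spark.read.format("delta").load("/mnt/silver/transactions")  # large
dims  = spark.read.format("delta").load("/mnt/silver/merchants")     # small

# broadcast() ships the small side to every executor, turning a costly
# sort-merge join (with a full shuffle of the fact table) into a
# map-side hash join
enriched = facts.join(broadcast(dims), on="merchant_id", how="left")

enriched.write.format("delta").mode("overwrite").save("/mnt/gold/transactions_enriched")
```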
posted 1 week ago

Informatica ETL Development Lead Programmer Analyst

Experience: 5 to 9 Yrs
Location: Hyderabad, Telangana
skills
  • SQL
  • Data Warehousing
  • Snowflake
  • GCP
  • AWS
  • Azure
  • Python
  • SDLC
  • Agile methodologies
  • Informatica ETL Development
  • Code Analysis
  • Refactoring
  • Databricks
  • BigQuery
  • PySpark
Job Description
As a highly skilled and experienced Informatica ETL Development Lead Programmer Analyst, you will be responsible for the following:
- Strong experience with ETL/ELT development and lifecycle management using Informatica Power Center.
- Advanced proficiency in SQL/PL/SQL, including performance tuning and data modeling.
- Solid understanding of data warehousing concepts, dimensional modeling, data governance, and data quality.
- Ability to review and refactor legacy ETL pipelines.
- Willingness to adopt new technologies and work in a cloud environment.
- Experience working with Databricks, Snowflake, or BigQuery.
- Familiarity with cloud platforms such as GCP, AWS, or Azure.
- Working knowledge of Python or PySpark.
- Understanding of SDLC and Agile methodologies.
- Communicating with customers and producing daily status reports.
- Strong oral and written communication skills.
- Ability to work well in a team, and to be proactive and adaptive.
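A sketch of one step in reviewing and refactoring legacy Oracle/Informatica logic toward PySpark, as this posting's mix of skills suggests: read the source table over JDBC, then re-express a simple mapping as a DataFrame transform. The JDBC URL and credentials are placeholders, and the Oracle JDBC driver jar must be on the Spark classpath.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("legacy-refactor").getOrCreate()

orders = (spark.read.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB")  # hypothetical
          .option("dbtable", "SALES.ORDERS")
          .option("user", "etl_ro")
          .option("password", "...")
          .load())

# Equivalent of a simple filter + expression transformation from a
# legacy mapping, expressed as DataFrame operations
result = (orders
          .filter(F.col("STATUS") == "SHIPPED")
          .withColumn("NET_AMOUNT", F.col("GROSS_AMOUNT") - F.col("DISCOUNT")))
result.show(5)
```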
posted 3 weeks ago
Experience: 6 to 10 Yrs
Location: Hyderabad, All India
skills
  • Testing
  • Data Engineering
  • Python
  • Java
  • SQL
  • Data warehousing
  • Selenium
  • Jenkins
  • AWS
  • Azure
  • Hadoop
  • Kafka
  • Docker
  • Kubernetes
  • Performance testing
  • Monitoring tools
  • Git
  • Agile methodologies
  • Functional testing
  • Regression testing
  • Verbal communication
  • Written communication
  • Analytical skills
  • NoSQL databases
  • ETL processes
  • QA activities
  • Automated test frameworks
  • Data pipelines
  • Google Cloud
  • Data Technologies
  • Data Bricks
  • PySpark
  • CI/CD pipelines
  • DevOps practices
  • Containerization technologies
  • Version control systems
  • DevOps methodologies
  • Test cases creation
  • Defects creation
  • Analyzing root cause
  • Problem-solving skills
Job Description
As an experienced candidate in testing, particularly in data engineering, you will be responsible for the following key areas:
- Strong coding skills in Python/Java
- Proficiency in SQL and NoSQL databases
- Hands-on experience in data engineering, ETL processes, and data warehousing QA activities
- Design and develop automated test frameworks for data pipelines and ETL processes
- Use tools like Selenium, Jenkins, and Python to automate test execution
- Experience with cloud platforms such as AWS, Azure, or Google Cloud
- Familiarity with data technologies like Databricks, Hadoop, PySpark, and Kafka
- Understanding of CI/CD pipelines and DevOps practices
- Knowledge of containerization technologies like Docker and Kubernetes
- Experience with performance testing and monitoring tools
- Familiarity with version control systems like Git
- Exposure to Agile and DevOps methodologies
- Experience in test case creation, functional and regression testing, defect creation, and root cause analysis
- Good verbal and written communication, analytical, and problem-solving skills
- Ability to work with team members globally (US, Taiwan, India, etc.)

In this role, your principal duties and responsibilities will include:
- Managing project priorities, deadlines, and deliverables with minimal supervision
- Serving as a technical lead on a subsystem or small feature
- Communicating with project leads to make recommendations about overcoming obstacles
- Anticipating complex issues and discussing them within and outside the project team
- Identifying test scenarios, overseeing test execution, and providing QA results to the business
- Assisting and mentoring other team members for training and performance management purposes

You are expected to work under some supervision, take responsibility for your work, and make decisions that have a moderate impact. You will need to convey complex information to multiple audiences, have a moderate amount of influence over key decisions, and exercise creativity in drafting original work products.

Minimum Qualifications:
- 6-8 years of proven experience in testing, particularly in data engineering

Preferred Qualifications:
- 10+ years of QA/testing experience
- Strong coding skills in Python/Java
- Proficiency in SQL and NoSQL databases

Qualcomm is an equal opportunity employer and is committed to providing accommodations for individuals with disabilities during the application/hiring process. If you have a disability and need an accommodation, you can contact Qualcomm for support.
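A self-contained sketch of an automated regression test for a small ETL transform, of the kind this role's "automated test frameworks for data pipelines" duty describes. It uses an in-memory SQLite database so it runs anywhere; in the environment above this would target the real warehouse instead.

```python
import sqlite3

def transform(conn):
    """Toy transform under test: aggregate raw events into a daily summary."""
    conn.execute("""
        INSERT INTO daily_totals (day, total)
        SELECT substr(ts, 1, 10), SUM(amount)
        FROM raw_events
        GROUP BY substr(ts, 1, 10)
    """)

def test_daily_totals_regression():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (ts TEXT, amount REAL)")
    conn.execute("CREATE TABLE daily_totals (day TEXT, total REAL)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)",
                     [("2024-05-01T09:00", 10.0), ("2024-05-01T17:30", 5.0),
                      ("2024-05-02T08:15", 7.5)])
    transform(conn)
    rows = dict(conn.execute("SELECT day, total FROM daily_totals"))
    # Expected totals are fixed by the seeded input, so any logic drift fails here
    assert rows == {"2024-05-01": 15.0, "2024-05-02": 7.5}
```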