951 ETL Developer Jobs in Faridabad

posted 2 months ago

Talend Developer

Venpa Global Technologies Private Limited
Experience: 6 to 9 Yrs
Salary: 12 LPA
Location: Noida, Bangalore, Chennai, Hyderabad, Gurugram, Pune

Skills:
  • tac
  • etl
  • data
  • soap
  • studio
  • rest
  • talend
  • sql
  • integration
  • migration
  • talend developer
Job Description
Talend Developer (6 years of experience). Upgrading Talend from 7.x to 8.0.1 involves not just infrastructure changes but also project-level upgrades that require Talend Developers to adapt, refactor, and validate their ETL jobs. Below is a detailed overview of the skills, roles, and responsibilities of Talend Developers during this upgrade process.

Key Skills for Talend Developers (Upgrade 7.3 to 8.x):
- Talend Studio expertise: proficiency in Talend Studio for Data Integration/Big Data/ESB (based on edition); familiarity with job designs, components, contexts, routines, and metadata repositories; knowledge of repository-based project management and shared resources.
- Component compatibility and migration: understanding of changed components and deprecated features in 8.x; ability to replace deprecated components with newer alternatives; experience upgrading custom routines, tJava code blocks, and external JARs.
- Version control and builds: experience with Git, SVN, or TAC-integrated version control; knowledge of the Talend CommandLine (CI Builder) for automated builds in 8.x.
- Testing and validation: expertise in unit testing and job-level validation; skill in comparing job outputs and logs across 7.x and 8.x; debugging and resolving issues caused by API, database driver, or job engine changes.
- Database and API knowledge: SQL scripting for data validation and comparison; understanding of any REST/SOAP API calls used in jobs; familiarity with data quality, data cleansing, and transformation logic.

Roles and Responsibilities of Talend Developers:
- Pre-upgrade assessment: review existing jobs for usage of deprecated or risky components; tag jobs that need refactoring or heavy testing; export and back up project sources from Talend Studio 7.x.
- Project upgrade execution: open and migrate projects using Talend Studio 8.x; resolve upgrade errors and component mapping warnings; replace or reconfigure any unsupported or changed features.
- Post-upgrade job testing: performance testing to ensure job outputs remain accurate; compare run times and log outputs for performance or logic issues; validate dependencies (e.g., database connectors, external services).
- Documentation and collaboration: document all changes made to migrated jobs; work with admins to troubleshoot TAC scheduling or job execution issues; communicate with QA and data teams for test case validation; implement reusable jobs and promote modular design using best practices; recommend improvements in job design and monitoring post-upgrade.
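The posting's emphasis on comparing job outputs across 7.x and 8.x lends itself to a small harness. Below is a minimal sketch, assuming both versions of a job land their output as CSV; the file names and key column are hypothetical placeholders.

```python
# Run the same job in Talend 7.x and 8.x, land both outputs as CSV, and
# diff them with pandas. File names and key columns are hypothetical.
import pandas as pd

def compare_job_outputs(path_7x: str, path_8x: str, key_cols: list[str]) -> pd.DataFrame:
    """Return rows that exist in only one of the two job outputs."""
    old = pd.read_csv(path_7x)
    new = pd.read_csv(path_8x)
    # Outer-join on the business key; the indicator column flags rows present
    # in only one output, i.e. regressions introduced by the upgrade.
    merged = old.merge(new, on=key_cols, how="outer",
                       indicator=True, suffixes=("_7x", "_8x"))
    return merged[merged["_merge"] != "both"]

if __name__ == "__main__":
    diffs = compare_job_outputs("orders_7x.csv", "orders_8x.csv", ["order_id"])
    print(f"{len(diffs)} mismatched rows")  # expect 0 for a clean migration
```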

posted 2 weeks ago

Ab Initio Developer

CAPGEMINI TECHNOLOGY SERVICES INDIA LIMITED
Experience: 5 to 10 Yrs
Location: Delhi, Noida, Bangalore, Chennai, Hyderabad, Gurugram, Kolkata, Pune, Mumbai City

Skills:
  • ab initio
  • unix shell scripting
  • sql
Job Description
Key Responsibilities:
- Design, develop, and implement ETL processes using Ab Initio GDE (Graphical Development Environment).
- Build and maintain Ab Initio graphs, plans, and sandboxes for data extraction, transformation, and loading.
- Work with business teams to understand data integration requirements and deliver efficient solutions.
- Use Ab Initio EME for version control, dependency management, and metadata governance.
- Perform data profiling, data validation, and quality checks using Ab Initio components and tools.
- Optimize ETL workflows for performance, scalability, and maintainability.
- Implement robust error handling, restartability, and logging mechanisms.
- Collaborate with DBAs, data modelers, and analysts to ensure data accuracy and consistency.
- Schedule and monitor jobs using Ab Initio Control Center (AICC) or enterprise schedulers.
- Support production systems, troubleshoot issues, and perform root cause analysis.

Required Technical Skills:
- Strong hands-on experience in Ab Initio GDE, EME, Co>Operating System, and Control Center.
- Proficiency with Ab Initio components such as Input/Output, Transform, Partition, Sort, Join, Lookup, Rollup, Reformat, Scan, and Dedup Sort, along with error handling using Rejects, Error Tables, and Error Ports for robust ETL design.
- Expertise in ETL design, development, and deployment for large-scale data environments.
- Proficiency in SQL and relational databases such as Oracle, Teradata, DB2, or SQL Server.
- Experience with UNIX/Linux shell scripting for automation and workflow integration.
- Understanding of data warehousing concepts (star schema, snowflake schema, slowly changing dimensions).
- Strong performance tuning and debugging skills in Ab Initio.
- Familiarity with data quality, metadata management, and data lineage.
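Ab Initio itself is graph-based, but the data validation and quality checks the posting describes reduce to source-vs-target reconciliation. A minimal, tool-agnostic sketch using Python's built-in sqlite3; the table and column names are hypothetical.

```python
# Reconcile a staging table against its loaded target: row counts plus a
# simple column aggregate. In practice the connection would point at the
# warehouse (Oracle, Teradata, etc.); sqlite3 keeps the sketch self-contained.
import sqlite3

def reconcile(conn: sqlite3.Connection, source: str, target: str, amount_col: str) -> bool:
    cur = conn.cursor()
    checks = []
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        checks.append(cur.fetchone())
    (src_rows, src_sum), (tgt_rows, tgt_sum) = checks
    print(f"rows: {src_rows} vs {tgt_rows}, sum({amount_col}): {src_sum} vs {tgt_sum}")
    return src_rows == tgt_rows and src_sum == tgt_sum

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_orders (id INTEGER, amount REAL);
        CREATE TABLE dw_orders  (id INTEGER, amount REAL);
        INSERT INTO stg_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.5);
    """)
    assert reconcile(conn, "stg_orders", "dw_orders", "amount")
```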
posted 2 days ago

SSE - QA ETL

CloudSufi
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • test automation
  • data validation
  • Selenium
  • GitHub
  • Power BI
  • SQL
  • Jira
  • Confluence
  • SharePoint
  • Postman
  • GitHub Actions
  • Azure environments
  • Databricks
  • Azure Functions
  • Agile/Scrum delivery models
  • AI tools for QA
  • Git-based repositories
Job Description
As a QA Engineer at CLOUDSUFI, a Google Cloud Premier Partner, your role will be to enable a shift-left testing approach by integrating QA early into the development lifecycle. This ensures data integrity, system reliability, and continuous delivery alignment within a cloud-first environment. You will work closely with Data Engineering, Analytics, and Development teams to deliver measurable improvements in quality, automation coverage, and defect prevention.

**Key Responsibilities:**
- **Automated Test Framework Development:** Design and implement reusable, scalable automation frameworks covering ETL pipelines, APIs, UI, and back-end data systems using Selenium and related tools.
- **End-to-End Test Coverage:** Provide comprehensive manual and automated test coverage across Dev, QA, and Production environments to validate all data processing and reporting layers.
- **Data Validation & Quality Assurance:** Conduct deep data validation across Databricks, Power BI, and Azure data flows to ensure accuracy, completeness, and consistency. Implement automated ETL testing to detect data issues early.
- **CI/CD Integration:** Integrate automated testing into CI/CD pipelines using GitHub Actions to support continuous testing, deployment validation, and faster release cycles.
- **QA Process Standardization:** Apply standardized QA processes, templates, and governance models aligned with enterprise QA best practices.
- **Jira-Based QA Management:** Use Jira for complete QA lifecycle management, including test planning, case management, defect tracking, and AI-assisted test creation. Use Confluence for documentation and traceability.
- **AI-Augmented Testing:** Use AI tools within Jira to automate test design, optimize test case selection, and improve overall test efficiency and coverage.
- **Reporting & Visibility:** Provide regular QA status updates, risk assessments, and mitigation plans. Maintain dashboards for test coverage, automation ROI, and defect trends.
- **Performance & Resilience Testing:** Conduct performance and load testing where applicable to ensure reliability of data services. Support DR testing and validation of environment stability.

**Qualifications Required:**
- Strong knowledge of test automation and data validation.
- Expertise in Selenium, Postman, GitHub, and GitHub Actions for CI/CD testing automation.
- Hands-on experience testing in Azure environments, including Databricks, Azure Functions, and Power BI.
- Strong SQL proficiency for data analysis, test data management, and validation.
- Experience executing tests across Agile/Scrum delivery models.
- Familiarity with AI tools for QA, Jira, Confluence, SharePoint, and Git-based repositories.

In addition to the technical skills, you are expected to bring the following soft skills and attributes:
- Strong analytical, documentation, and communication skills.
- Detail-oriented and highly organized, with an automation-first mindset.
- Proactive, self-driven, and capable of operating independently in a fast-paced environment.
- Committed to continuous improvement and measurable quality outcomes.

The role also calls for the following behavioral competencies:
- Experience working with US/Europe-based clients in an onsite/offshore delivery model.
- Excellent verbal and written communication, technical articulation, listening, and presentation skills.
- Proven analytical and problem-solving skills.
- Effective task prioritization, time management, and internal/external stakeholder management skills.
- Quick learner and team player.
- Experience working under stringent deadlines in a matrix organization structure.
- Demonstrated organizational citizenship behavior (OCB) in past organizations.
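The "automated ETL testing to detect data issues early" this posting describes is often expressed as plain pytest checks. A minimal sketch follows; the table, columns, and thresholds are hypothetical, and sqlite3 stands in for Databricks or Azure SQL.

```python
# Shift-left ETL data-quality checks as pytest tests, runnable in a CI job.
import sqlite3
import pytest

@pytest.fixture()
def conn():
    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, amount REAL, sold_at TEXT);
        INSERT INTO fact_sales VALUES (1, 99.0, '2024-01-01'), (2, 15.5, '2024-01-02');
    """)
    yield db
    db.close()

def test_no_null_business_keys(conn):
    nulls = conn.execute("SELECT COUNT(*) FROM fact_sales WHERE sale_id IS NULL").fetchone()[0]
    assert nulls == 0

def test_amounts_are_positive(conn):
    bad = conn.execute("SELECT COUNT(*) FROM fact_sales WHERE amount <= 0").fetchone()[0]
    assert bad == 0
```

A GitHub Actions workflow would then run `pytest` on every pull request, which is the continuous-testing loop the posting describes.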

posted 2 months ago

ETL Developer

BirlaSoft
Experience: 2 to 6 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Financial Services
  • SQL
  • Python
  • AWS
  • GCP
  • DevOps
  • Tableau
  • Communications
  • Insurance domain
  • ETL pipeline design
Job Description
Role Overview: As an ETL Developer at Birlasoft, you will design, implement, and maintain ETL pipelines. Your role will involve analyzing data, identifying deliverables, and ensuring consistency in the processes. You will play a crucial part in enhancing business processes for customers and contributing to the growth of sustainable communities.

Key Responsibilities:
- Analyze data to identify deliverables, gaps, and inconsistencies
- Design, implement, and maintain ETL pipelines
- Work with AWS, GCP, and infrastructure-as-code systems
- Apply SQL and Python skills to pipeline development
- Collaborate on visualization tooling, with a preference for Tableau
- Demonstrate critical thinking and problem-solving abilities
- Manage multiple task assignments independently
- Apply strong communication skills, both verbal and written
- Prioritize tasks effectively in a fast-paced environment
- Uphold a commitment to customer service and company growth
- Exhibit superior analytical skills

Qualifications Required:
- Strong communication skills, both verbal and written
- More than 2 years of experience in SQL and Python
- Experience in ETL pipeline design and maintenance
- Familiarity with AWS, GCP, and infrastructure-as-code systems
- Ability to work independently and manage multiple task assignments efficiently
- Superior analytical and problem-solving abilities
- Nice to have: experience in the Financial Services and Insurance domain

(Note: Additional details about Birlasoft's approach and commitment to building sustainable communities have been omitted as they were not directly related to the job role.)
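For the SQL + Python pipeline work this posting names, a minimal extract-transform-load sketch looks like the following; the source CSV, its columns, and the target table are hypothetical.

```python
# A minimal extract-transform-load sketch. File and table names are made up.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Typical cleanup: normalize column names, drop exact duplicates,
    # and exclude rows missing the business key rather than silently loading them.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.drop_duplicates()
    return df[df["customer_id"].notna()]

def load(df: pd.DataFrame, conn: sqlite3.Connection, table: str) -> int:
    df.to_sql(table, conn, if_exists="append", index=False)
    return len(df)

if __name__ == "__main__":
    conn = sqlite3.connect("warehouse.db")
    rows = load(transform(extract("customers.csv")), conn, "dim_customer")
    print(f"loaded {rows} rows")
```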
posted 1 month ago
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Benefits administration
  • Data mapping
  • Interface design
  • MSSQL
  • HCM systems
  • ETL processes
Job Description
As a Consultant at UKG, you will design, configure, and implement Benefits solutions within the UKG Pro platform. Your expertise in MSSQL for reporting, interfaces, and troubleshooting, along with in-depth domain knowledge of Benefits administration and integration, will be crucial for success.

Key Responsibilities:
- Understand customer requirements related to Benefits setup, eligibility, enrollment, and compliance.
- Design, configure, and implement Benefits modules within UKG Pro.
- Create and maintain MSSQL queries, stored procedures, and reports to support Benefits processing and integrations.
- Develop inbound/outbound interfaces with third-party Benefits providers.
- Prepare and maintain design documents, configuration guides, and test scripts.
- Support User Acceptance Testing (UAT) and resolve issues during test cycles.
- Deploy solutions to production environments and provide transition-to-support documentation.
- Upgrade clients to newer UKG Pro releases and ensure Benefits functionality is delivered seamlessly.
- Collaborate with customers during evening calls (as needed) to align with US time zones and project schedules.
- Support major Benefits releases, open enrollment cycles, and compliance updates.

Qualifications:
- Bachelor's degree or equivalent in Computer Science, Information Systems, or a related field.
- Strong expertise with MSSQL (queries, stored procedures, performance tuning, troubleshooting).
- Hands-on experience configuring Benefits modules within HCM systems (preferably UKG Pro).
- Solid understanding of Benefits administration (eligibility rules, enrollment, open enrollment, COBRA, ACA, 401k, etc.).
- Experience integrating with third-party providers for Benefits (healthcare, retirement, insurance vendors).
- Knowledge of data mapping, ETL processes, and interface design.
- Excellent oral and written communication skills.
- Strong analytical, problem-solving, and customer-facing skills.

At UKG, you will be part of a team on a mission to inspire every organization to become a great place to work through award-winning HR technology. With a focus on diversity and inclusion, UKG is committed to promoting a collaborative and inclusive workplace environment.
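The MSSQL reporting work described above is commonly scripted from Python with pyodbc. A minimal sketch follows; the DSN, table, and column names are hypothetical placeholders, not UKG Pro's actual schema.

```python
# Parameterized MSSQL reporting query via pyodbc. All names are hypothetical.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=hcm-db.example.com;DATABASE=UKGPro;Trusted_Connection=yes;"
)

def open_enrollment_counts(plan_year: int) -> list[tuple]:
    """Enrollment counts per benefit plan for a given plan year."""
    query = """
        SELECT plan_code, COUNT(*) AS enrolled
        FROM benefits_enrollment
        WHERE plan_year = ? AND status = 'ACTIVE'
        GROUP BY plan_code
        ORDER BY enrolled DESC;
    """
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.execute(query, plan_year)  # ? placeholder keeps the query injection-safe
        return cur.fetchall()

if __name__ == "__main__":
    for plan, n in open_enrollment_counts(2025):
        print(plan, n)
```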
posted 2 months ago

Snowflake Developer

Viraaj HR Solutions Private Limited
Experience: 3 to 7 Yrs
Location: Delhi
Skills:
  • cloud computing
  • data modeling
  • etl tools
  • analytical skills
  • data integration
  • azure
  • data governance
  • snowflake
  • metadata management
  • troubleshooting
  • communication
  • team collaboration
  • data warehousing
  • performance tuning
  • sql proficiency
  • cloud platforms (AWS, Google Cloud)
  • problem-solving
Job Description
Role Overview: Viraaj HR Solutions, a dynamic HR consultancy, is seeking a skilled Snowflake Developer to design, develop, and implement data solutions using the Snowflake platform. As part of the team, you will optimize and manage Snowflake databases, develop data integration workflows, and ensure data quality and integrity across all data stores. Your role will involve collaborating with data scientists, conducting system performance tuning, and staying updated on Snowflake features and best practices.

Key Responsibilities:
- Design, develop, and implement data solutions using the Snowflake platform.
- Optimize and manage Snowflake databases and schemas for efficient data access.
- Develop data integration workflows using ETL tools with Snowflake.
- Implement and maintain data models to support business analytics.
- Prepare and optimize SQL queries for performance improvements.
- Collaborate with data scientists and analysts to address data-related issues.
- Monitor Snowflake usage and provide recommendations for resource management.
- Ensure data quality and integrity across all data stores.
- Document data architecture and design processes for future reference.
- Conduct system performance tuning and troubleshooting of the Snowflake environment.
- Integrate Snowflake with cloud-based services and tools as required.
- Participate in code reviews and provide constructive feedback.
- Stay updated on Snowflake features and industry best practices.
- Assist in training team members on Snowflake capabilities and tools.
- Work closely with stakeholders to gather requirements and define project scope.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- At least 3 years of experience in data engineering or database development.
- Strong expertise in the Snowflake platform; certification is a plus.
- Proficient in SQL with hands-on experience in complex queries.
- Experience with ETL tools and data integration techniques.
- Understanding of data warehousing concepts and best practices.
- Familiarity with cloud platforms like AWS, Azure, or Google Cloud.
- Ability to troubleshoot and solve data-related issues.
- Excellent analytical and problem-solving skills.
- Strong communication and team collaboration skills.
- Experience in data modeling and metadata management.
- Ability to work independently and manage multiple tasks effectively.
- Knowledge of additional programming languages is a plus.
- Understanding of data governance practices.
- Willingness to learn and adapt to new technologies.
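Running an analytical query against Snowflake from Python typically goes through the snowflake-connector-python package. A minimal sketch, assuming hypothetical credentials, warehouse, and table names:

```python
# Querying Snowflake with snowflake-connector-python. Account, credentials,
# and object names are placeholders; production code would use key-pair auth
# or a secrets manager rather than an inline password.
import snowflake.connector

conn = snowflake.connector.connect(
    user="ETL_SVC",
    password="...",
    account="myorg-myaccount",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("""
        SELECT c.region, SUM(o.amount) AS revenue
        FROM orders o
        JOIN dim_customer c ON c.customer_id = o.customer_id
        WHERE o.order_date >= DATEADD(day, -30, CURRENT_DATE)
        GROUP BY c.region
    """)
    for region, revenue in cur.fetchall():
        print(region, revenue)
finally:
    conn.close()
```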
posted 2 weeks ago
Experience: 4 to 10 Yrs
Location: Noida, All India
Skills:
  • SQL queries
  • Stored procedures
  • Data warehousing
  • Analytical skills
  • Communication skills
  • Performance tuning
  • Data quality
  • Data governance
  • Version control
  • Azure
  • ETL development
  • SQL Server Integration Services (SSIS)
  • Problem-solving skills
  • Deployment process
  • Cloud platforms (AWS)
Job Description
You will be working as an ETL Developer specializing in SSIS (SQL Server Integration Services) within our Digital Customer Experience organization. Your role will involve designing, developing, and implementing effective ETL processes to consolidate and integrate data from various sources into our data warehouse. Your expertise in SSIS will play a crucial role in ensuring the accuracy, reliability, and performance of our data pipelines. Collaboration with data analysts, business intelligence developers, and other stakeholders will be essential to understand data requirements and deliver high-quality data solutions.

**Key Responsibilities:**
- Design, develop, and implement effective ETL processes using SQL Server Integration Services (SSIS).
- Write complex SQL queries and stored procedures to support data integration.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Ensure data quality and governance practices are maintained.
- Optimize and tune the performance of ETL processes.
- Work on version control and deployment processes for SSIS packages.
- Explore and leverage cloud platforms and services like AWS and Azure.
- Willingness to work in the afternoon shift from 3 PM to 12 AM IST.

**Qualifications Required:**
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- 4+ years of experience in ETL development and data integration.
- Strong expertise in SQL Server Integration Services (SSIS).
- Proficiency in writing complex SQL queries and stored procedures.
- Experience with data warehousing concepts and database design.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Familiarity with other ETL tools like Informatica and Talend is a plus.
- Knowledge of data quality and data governance practices.
- Understanding of cloud platforms and services (e.g., AWS, Azure).

This role will provide you with opportunities to enhance your technical skills, leadership abilities, and relationship skills through various in-house training programs. Additionally, the transparent work culture at our organization encourages the sharing of ideas and initiatives at the enterprise level, ensuring your career growth and success. At TELUS Digital, we are committed to creating a diverse and inclusive workplace, where all aspects of employment are based on qualifications and performance, without any discrimination based on diversity characteristics.
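A core pattern behind the "complex SQL supporting data integration" in SSIS flows is the idempotent upsert (in T-SQL, usually a MERGE statement). A minimal runnable sketch of the same idea, using sqlite3's upsert syntax and a hypothetical dimension table:

```python
# Idempotent "upsert" load: re-running the same batch leaves the table
# unchanged, which makes retries after a failed load safe.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")

def upsert_customers(rows: list[tuple]) -> None:
    conn.executemany(
        """
        INSERT INTO dim_customer (customer_id, name, city)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name, city = excluded.city
        """,
        rows,
    )

upsert_customers([(1, "Asha", "Noida"), (2, "Ravi", "Pune")])
upsert_customers([(2, "Ravi", "Mumbai")])   # update, not a duplicate row
print(conn.execute("SELECT * FROM dim_customer").fetchall())
```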
posted 1 week ago
Experience: 2 to 6 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • ETL
  • Python
  • Airflow
  • SQL
  • SQL Server
  • Oracle
  • PostgreSQL
  • Data Warehousing
  • Redshift
  • Dimensional Modelling
Job Description
Role Overview: As an ETL Specialist at BOLD, you will build the architecture and maintain a robust, scalable, and sustainable business intelligence platform. Working closely with the Data Team, you will handle highly scalable systems, complex data models, and a large amount of transactional data.

Key Responsibilities:
- Architect, develop, and maintain a highly scalable data warehouse and build/maintain ETL processes
- Utilize Python and Airflow to integrate data from across the business into the data warehouse
- Integrate third-party data, such as Google Analytics, Google Ads, and Iterable, into the data warehouse

Qualifications Required:
- Experience working as an ETL developer in a Data Engineering, Data Warehousing, or Business Intelligence team
- Understanding of data integration/data engineering architecture and familiarity with ETL standards, methodologies, guidelines, and techniques
- Proficiency in the Python programming language and its packages like Pandas and NumPy
- Strong understanding of SQL queries, aggregate functions, complex joins, and performance tuning
- Exposure to databases like Redshift, SQL Server, Oracle, or PostgreSQL (any one of these)
- Broad understanding of data warehousing and dimensional modeling concepts

Additional Details: At BOLD, the Business Intelligence (BI) team manages all aspects of the organization's BI strategy, projects, and systems. The BI team enables business leaders to make data-driven decisions by providing reports and analysis. It plays a crucial role in developing and managing a latency-free, credible enterprise data warehouse, which serves as a data source for decision-making across various functions of the organization. The team comprises sub-components focusing on data analysis, ETL, data visualization, and QA, leveraging technologies like Snowflake, Sisense, MicroStrategy, Python, R, and Airflow.

About BOLD: BOLD is a global organization that helps people find jobs and transform their work lives. The company creates digital products that empower individuals worldwide to build stronger resumes, cover letters, and CVs. With a strong focus on diversity and inclusion, BOLD celebrates and promotes a culture of growth, success, and professional fulfillment for its employees.
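The Python + Airflow pattern this posting describes boils down to a DAG with an extract task feeding a load task. A minimal sketch against the Airflow 2.x API; the DAG id, connection details, and callables are hypothetical, with the third-party API calls stubbed out.

```python
# One daily DAG: pull a third-party source (Google Analytics as a stand-in)
# and load it into the warehouse.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_ga(**context):
    # In practice: call the Google Analytics API and stage the rows.
    print("extracting GA sessions for", context["ds"])

def load_warehouse(**context):
    # In practice: COPY the staged file into Redshift/PostgreSQL.
    print("loading staged rows")

with DAG(
    dag_id="ga_to_warehouse",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_ga", python_callable=extract_ga)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load  # load runs only after extract succeeds
```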
posted 3 weeks ago
Experience: 2 to 6 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Python
  • Airflow
  • SQL
  • SQL Server
  • Oracle
  • PostgreSQL
  • Data Warehousing
  • Redshift
  • Dimensional Modelling
Job Description
Role Overview: You will be joining as an ETL Specialist at BOLD, where you will play a crucial role in building and maintaining a highly scalable and sustainable business intelligence platform. Working closely with the Data Team, you will manage complex data models, highly scalable systems, and a large amount of transactional data.

Key Responsibilities:
- Architect, develop, and maintain a highly scalable data warehouse
- Build and maintain ETL processes using Python and Airflow
- Integrate third-party data sources like Google Analytics, Google Ads, and Iterable into the data warehouse

Qualifications Required:
- Minimum 2.5 years of experience working as an ETL developer in a Data Engineering, Data Warehousing, or Business Intelligence team
- Strong understanding of data integration and data engineering architecture, as well as ETL standards, methodologies, guidelines, and techniques
- Hands-on experience with the Python programming language and its packages like Pandas and NumPy
- Proficiency in SQL queries, aggregate functions, complex joins, and performance tuning
- Exposure to databases like Redshift, SQL Server, Oracle, or PostgreSQL
- Broad understanding of data warehousing and dimensional modeling concepts

Additional Details: BOLD is a global organization that helps people find jobs by creating digital products that empower millions of individuals worldwide to build stronger resumes and secure the right job. The company values diversity, inclusion, and professional growth, offering outstanding compensation, full health benefits, flexible time away, and additional benefits like internet reimbursement, catered meals, and certification policies.
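Syncing "a large amount of transactional data" into a warehouse usually relies on the incremental-load (high-watermark) pattern rather than full reloads. A minimal sketch with hypothetical tables, using sqlite3 as a stand-in for the source and target databases:

```python
# Incremental load: move only rows newer than the stored watermark, then
# advance the watermark after a successful load.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_events (id INTEGER PRIMARY KEY, payload TEXT, updated_at TEXT);
    CREATE TABLE wh_events  (id INTEGER PRIMARY KEY, payload TEXT, updated_at TEXT);
    CREATE TABLE etl_state  (table_name TEXT PRIMARY KEY, watermark TEXT);
    INSERT INTO src_events VALUES (1, 'a', '2024-01-01'), (2, 'b', '2024-01-03');
    INSERT INTO etl_state  VALUES ('wh_events', '2024-01-02');
""")

def incremental_load() -> int:
    (watermark,) = conn.execute(
        "SELECT watermark FROM etl_state WHERE table_name = 'wh_events'"
    ).fetchone()
    rows = conn.execute(
        "SELECT id, payload, updated_at FROM src_events WHERE updated_at > ?",
        (watermark,),
    ).fetchall()
    conn.executemany("INSERT OR REPLACE INTO wh_events VALUES (?, ?, ?)", rows)
    if rows:  # advance the watermark only after the rows are in
        conn.execute(
            "UPDATE etl_state SET watermark = ? WHERE table_name = 'wh_events'",
            (max(r[2] for r in rows),),
        )
    return len(rows)

print(incremental_load())  # 1 -> only the row newer than the watermark moves
```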
posted 2 months ago

ETL Testing

Allime Tech Solutions
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • ETL testing
  • data validation
  • data analysis
  • SQL
  • ETL tools
  • JIRA
  • HP ALM
  • communication skills
  • analytical skills
  • Agile/Scrum
  • problem-solving skills
Job Description
As an ETL Tester at Allime Tech Solutions, you will ensure the quality and accuracy of data through ETL testing, data validation, and data analysis. You will draw on your Bachelor's degree in Computer Science or a related field and your 3+ years of experience in ETL testing, and your strong SQL skills and proficiency with ETL tools will be essential to excel in this position.

Key Responsibilities:
- Perform ETL testing to validate and analyze data
- Utilize strong SQL skills for data validation
- Work with testing tools such as JIRA and HP ALM
- Collaborate effectively with cross-functional teams
- Prioritize work effectively across multiple projects
- Apply Agile/Scrum software development methodology
- Demonstrate strong analytical and problem-solving skills

Qualifications Required:
- Bachelor's degree in Computer Science or a related field
- 3+ years of experience in ETL testing, data validation, and data analysis
- Strong SQL skills and experience with ETL tools
- Experience with testing tools such as JIRA and HP ALM
- Excellent communication skills and ability to collaborate with cross-functional teams
- Ability to work independently and as a team member
- Strong analytical and problem-solving skills

Allime Tech Solutions is dedicated to empowering innovation through technology and connecting talent with opportunity. Committed to integrity and excellence, we strive to provide tailored solutions for our clients. Join us in creating a future where everyone can thrive.
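A staple ETL-testing check is a row-level comparison of a source extract against the loaded target. A minimal pandas sketch with made-up data and a hypothetical business key:

```python
# Source-vs-target diff: missing rows and per-column value mismatches.
import pandas as pd

source = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.0]})
target = pd.DataFrame({"id": [1, 2], "amount": [10.0, 25.0]})

# Full outer join with an indicator column: rows only on one side are
# missing/extra records; matched rows are then compared column by column.
diff = source.merge(target, on="id", how="outer", indicator=True, suffixes=("_src", "_tgt"))
missing = diff[diff["_merge"] == "left_only"]          # dropped by the load
mismatched = diff[(diff["_merge"] == "both") & (diff["amount_src"] != diff["amount_tgt"])]

print(f"missing rows: {len(missing)}, value mismatches: {len(mismatched)}")
```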
posted 2 months ago
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • PostgreSQL
  • Microsoft SQL Server
  • Elasticsearch
  • MongoDB
  • SQL
  • Python
  • NumPy
  • Docker
  • Kubernetes
  • ETL Data Engineer
  • CURA
  • pandas
  • Polars
Job Description
As an ETL Data Engineer, you will re-engineer existing data pipelines to extract data from a new source (the PostgreSQL-based CURA system) instead of Microsoft SQL Server, while maintaining data quality, system performance, and end-user experience during the migration.

Key Responsibilities:
- Analyze current ETL pipelines and their dependencies on Microsoft SQL Server.
- Design and implement modifications to redirect ETL extractions to PostgreSQL (CURA) while keeping the current transformation and load logic for Elasticsearch and MongoDB.
- Ensure end-to-end data integrity, quality, and freshness after the source switch.
- Write efficient SQL queries for data extraction from the new source.
- Perform performance testing to validate that pipeline throughput and latency do not degrade.
- Collaborate with DevOps and platform teams to containerize, orchestrate, and deploy updated ETLs using Docker and Kubernetes.
- Monitor post-deployment performance and proactively address any production issues.
- Document design, code, data mappings, and operational runbooks.

Skills and Qualifications Required:
- Strong experience in building and maintaining large-scale distributed data systems.
- Expertise in Python, specifically data analysis/manipulation libraries like pandas, NumPy, and Polars.
- Advanced SQL development skills with a focus on performance optimization.
- Proficiency in Docker and Kubernetes.
- Familiarity with Elasticsearch and MongoDB as data stores.
- Experience working in production environments with mission-critical systems.
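The source switch described above is easiest when extraction is isolated behind a connection factory, so the SQL Server URL can be swapped for PostgreSQL without touching transform/load code. A minimal sketch with SQLAlchemy and pandas; the URLs, query, and table names are hypothetical.

```python
# Swap the extraction source ("legacy" SQL Server -> "cura" PostgreSQL)
# behind one function; downstream transform/load logic stays unchanged.
import pandas as pd
from sqlalchemy import create_engine

SOURCES = {
    "legacy": "mssql+pyodbc://user:pass@legacy-server/CRM?driver=ODBC+Driver+18+for+SQL+Server",
    "cura": "postgresql+psycopg2://user:pass@cura-db:5432/cura",
}

def extract_cases(source: str) -> pd.DataFrame:
    engine = create_engine(SOURCES[source])
    # Same logical dataset from either source; only the dialect differs.
    return pd.read_sql("SELECT case_id, status, updated_at FROM cases", engine)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Unchanged downstream logic feeding the Elasticsearch/MongoDB loaders.
    return df.assign(status=df["status"].str.upper())

if __name__ == "__main__":
    df = transform(extract_cases("cura"))  # flip "cura" -> "legacy" to compare outputs
    print(df.head())
```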
posted 2 weeks ago

AxiomSL Developer

Nixora Group
Experience: 3 to 7 Yrs
Location: All India, Gurugram
Skills:
  • SQL
  • ETL
  • Python
  • UNIX scripting
  • Banking
  • Financial Services
  • Regulatory Compliance
  • Data Governance
  • AxiomSL ControllerView
  • Data Modelling
Job Description
As an AxiomSL Developer joining the Credit Risk Regulatory Reporting team, you will design, implement, and maintain reporting solutions on the AxiomSL ControllerView platform. Your primary focus will be the accurate and timely delivery of credit risk regulatory reports to APRA and other regulatory bodies, in line with internal risk and compliance standards.

Key Responsibilities:
- Develop, configure, and maintain AxiomSL ControllerView data models, workflows, and reports for credit risk regulatory submissions.
- Design and implement data sourcing, transformation, and mapping logic for regulatory templates (e.g., ARF 223, ARF 220, ARF 112).
- Collaborate with Risk, Finance, and Data teams to ensure data accuracy and completeness.
- Participate in full SDLC activities: requirement analysis, design, development, testing, and deployment of AxiomSL components.
- Optimize existing AxiomSL data flows and calculation modules for performance and compliance.
- Conduct impact analysis and regression testing for regulatory change implementations.
- Support production runs and troubleshoot data or report generation issues.
- Maintain documentation for data lineage, configuration changes, and operational processes.

Required Skills and Experience:
- 3+ years of hands-on experience with AxiomSL ControllerView (design, development, and implementation).
- Strong knowledge of credit risk regulatory reporting (e.g., APRA ARF reports, Basel III/IV frameworks).
- Proficiency in SQL, data modeling, and ETL concepts.
- Experience in a banking or financial services environment with exposure to regulatory compliance.
- Strong analytical and problem-solving skills.
- Understanding of data governance and control frameworks.
- Experience with source systems such as Moody's Risk Authority, OFSAA, or other risk data warehouses is advantageous.
- Familiarity with Python, UNIX scripting, or automation tools is a plus.

Qualifications:
- Bachelor's degree in Computer Science, Finance, Information Systems, or a related discipline.
- Relevant certifications in AxiomSL, risk management, or data engineering are desirable.

Soft Skills:
- Strong communication and stakeholder management skills.
- Ability to work independently and within cross-functional teams.
- Detail-oriented approach to data accuracy and regulatory compliance.

If you join, you will:
- Work on a mission-critical platform within a Tier 1 bank.
- Gain exposure to complex regulatory reporting landscapes (APRA, Basel, IFRS).
- Be part of a collaborative environment focused on technology-driven compliance solutions.
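AxiomSL itself is proprietary, but the mapping work described above has a checkable core: every field in a regulatory template must trace to a source column. A minimal, generic Python sketch; the template fields and mappings are hypothetical illustrations, not real ARF 223 content.

```python
# Lineage-gap check: which template fields have no documented source mapping?
TEMPLATE_FIELDS = {"exposure_id", "counterparty_id", "exposure_amount", "risk_weight"}

SOURCE_MAPPING = {
    "exposure_id": "crm.exposures.exp_id",
    "counterparty_id": "crm.parties.party_id",
    "exposure_amount": "crm.exposures.ead_amount",
}

def unmapped_fields(template: set[str], mapping: dict[str, str]) -> set[str]:
    """Template fields with no documented source column (lineage gaps)."""
    return template - mapping.keys()

if __name__ == "__main__":
    gaps = unmapped_fields(TEMPLATE_FIELDS, SOURCE_MAPPING)
    print("unmapped:", sorted(gaps))  # -> ['risk_weight']
```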
posted 2 months ago

F5 LOAD BALANCER

Live Connections
Experience: 4 to 8 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • TCPDump
  • Wireshark
  • Network Traffic Analysis
  • F5 BigIP LTM
  • F5 BigIP APM
  • F5 BigIP ASM
  • F5 BigIP AFM
  • iRules
  • Fortinet Firewall
  • Network Switching
  • Network Routing
Job Description
As an experienced professional with 4 to 6 years of experience, you should have significant expertise in supporting and maintaining F5 Big-IP LTM/APM/ASM/AFM in operations. Your responsibilities will include setting up, maintaining, upgrading, and replacing the latest generation of F5 devices. Proficiency in crafting and understanding iRules is essential, and experience with tools like TCPDump and Wireshark for analyzing network traffic will be valuable in this role.

Key Responsibilities:
- Perform daily operational tasks submitted by customers through service management tools
- Troubleshoot network problems and network device configurations, and coordinate with department administrators to resolve connectivity issues

Qualifications Required:
- Minimum 4+ years of hands-on experience in the F5 LTM/APM/ASM/AFM domain
- Proficiency in Fortinet Firewall
- Hands-on experience in network switching and routing
posted 2 months ago

Senior ETL Developer

DataFlow Group
Experience: 10 to 14 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • ETL
  • Data Streaming
  • AWS
  • Talend
  • Informatica
  • Apache Kafka
  • SQL
  • Python
  • Scala
  • Relational databases
  • AWS Kinesis
  • AWS S3
  • AWS Glue
  • AWS Redshift
  • AWS Lake Formation
  • AWS EMR
  • NoSQL databases
Job Description
Role Overview: You are a highly skilled and experienced Senior ETL & Data Streaming Engineer with over 10 years of experience. Your main responsibility will be to design, develop, and maintain robust data pipelines. Expertise in both batch ETL processes and real-time data streaming technologies, along with hands-on experience with AWS data services, will be crucial, as will a track record of working with Data Lake architectures and traditional Data Warehousing environments.

Key Responsibilities:
- Design, develop, and implement highly scalable, fault-tolerant, and performant ETL processes using industry-leading ETL tools to extract, transform, and load data from various source systems into our Data Lake and Data Warehouse.
- Architect and build batch and real-time data streaming solutions using technologies like Talend, Informatica, Apache Kafka, or AWS Kinesis to support immediate data ingestion and processing requirements.
- Utilize and optimize a wide array of AWS data services, including AWS S3, AWS Glue, AWS Redshift, AWS Lake Formation, AWS EMR, and others, to build and manage data pipelines.
- Collaborate with data architects, data scientists, and business stakeholders to understand data requirements and translate them into efficient data pipeline solutions.
- Ensure data quality, integrity, and security across all data pipelines and storage solutions.
- Monitor, troubleshoot, and optimize existing data pipelines for performance, cost-efficiency, and reliability.
- Develop and maintain comprehensive documentation for all ETL and streaming processes, data flows, and architectural designs.
- Implement data governance policies and best practices within the Data Lake and Data Warehouse environments.
- Mentor junior engineers and contribute to fostering a culture of technical excellence and continuous improvement.
- Stay abreast of emerging technologies and industry best practices in data engineering, ETL, and streaming.

Required Qualifications:
- 10+ years of progressive experience in data engineering, with a strong focus on ETL, ELT, and data pipeline development.
- Deep expertise in ETL tools: extensive hands-on experience with commercial or open-source ETL tools (Talend).
- Strong proficiency in data streaming technologies: proven experience with real-time data ingestion and processing using platforms such as AWS Glue, Apache Kafka, AWS Kinesis, or similar.
- Extensive AWS data services experience: proficiency with AWS S3 for data storage and management; hands-on experience with AWS Glue for ETL orchestration and data cataloging; strong knowledge of AWS Redshift for data warehousing and analytics; familiarity with AWS Lake Formation for building secure data lakes; experience with AWS EMR for big data processing is good to have.
- Data Warehouse (DWH) knowledge: strong background in traditional data warehousing concepts, dimensional modeling (Star Schema, Snowflake Schema), and DWH design principles.
- Programming languages: proficient in SQL and at least one scripting language (e.g., Python, Scala) for data manipulation and automation.
- Database skills: strong understanding of relational and NoSQL databases.
- Version control: experience with version control systems (e.g., Git).
- Problem-solving: excellent analytical and problem-solving skills with a keen eye for detail.
- Communication: strong verbal and written communication skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.

Preferred Qualifications:
- Certifications in AWS Data Analytics or other relevant areas.
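Real-time ingestion with AWS Kinesis, one of the streaming stacks the posting lists, is a single boto3 call per event. A minimal sketch; the stream name, region, and event shape are hypothetical, with credentials coming from the standard AWS environment/config chain.

```python
# Publish one event to a Kinesis data stream with boto3.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_event(event: dict) -> None:
    # PartitionKey controls shard routing; keying by order_id keeps all
    # events for the same order in sequence on one shard.
    kinesis.put_record(
        StreamName="orders-stream",
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=str(event["order_id"]),
    )

if __name__ == "__main__":
    publish_event({"order_id": 42, "amount": 199.0, "status": "CREATED"})
```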
posted 2 weeks ago
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Process Mining
  • ETL
  • DAX
  • SQL
  • Data Modeling
  • Stakeholder Management
  • Power BI Developer
  • Power Query
Job Description
You are an experienced Power BI Developer specializing in Process Mining, responsible for turning complex business data into powerful insights and driving process optimization using advanced BI capabilities.

Key Responsibilities:
- Lead end-to-end Process Mining initiatives using Power BI and Power Automate Process Mining
- Perform ETL and integrate data from ERP, CRM, databases, and cloud sources
- Build and optimize data models using DAX and Power Query
- Develop impactful dashboards and process insights highlighting bottlenecks and automation opportunities
- Define and monitor KPIs with business teams
- Collaborate closely with process owners, BA teams, and IT stakeholders
- Continuously enhance performance and stay updated on the latest BI and process mining trends

Required Skills:
- 5+ years as a Power BI Developer
- Hands-on experience with Process Mining (Power Automate Process Mining preferred)
- Strong skills in DAX, Power Query, SQL, and data modeling (star/snowflake schema)
- Solid ETL understanding and excellent analytical/problem-solving skills
- Strong communication and stakeholder management abilities
- Exposure to the Power Platform (Power Apps, Power Automate) is a big plus

Good to Have:
- Experience with tools like Celonis or UiPath Process Mining
- Microsoft certifications
- Agile environment experience

Education:
- Bachelor's degree in Computer Science, IT, Data Analytics, BI, or a related field
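Stripped of any one tool, process mining reduces to event-log analysis: each row is (case, activity, timestamp), and bottlenecks show up as long gaps between consecutive activities within a case. A minimal pandas sketch with made-up data:

```python
# Surface slow hand-offs from a process event log.
import pandas as pd

log = pd.DataFrame(
    {
        "case_id": [1, 1, 1, 2, 2],
        "activity": ["Create", "Approve", "Ship", "Create", "Approve"],
        "ts": pd.to_datetime(
            ["2024-01-01 09:00", "2024-01-03 17:00", "2024-01-04 08:00",
             "2024-01-02 10:00", "2024-01-02 11:30"]
        ),
    }
)

log = log.sort_values(["case_id", "ts"])
# Wait time of each hand-off: gap since the previous activity in the same case.
log["wait"] = log.groupby("case_id")["ts"].diff()
log["step"] = log.groupby("case_id")["activity"].shift() + " -> " + log["activity"]

# Mean wait per transition ranks the bottlenecks ("Create -> Approve" here).
print(log.dropna().groupby("step")["wait"].mean().sort_values(ascending=False))
```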
posted 3 weeks ago

SQL Developer

Mobile Programming
Experience: 4 to 8 Yrs
Location: All India, Gurugram
Skills:
  • SQL
  • SSRS
  • SSIS
  • TSQL
  • ETL
  • Data Warehousing
  • Data Migration
  • Data Cleansing
  • Data Transformation
  • Query Tuning
  • Indexing
Job Description
Role Overview: As an SQL Developer with expertise in SSRS and SSIS, you will design, develop, and maintain SQL databases, data integration packages, and business intelligence reports to facilitate efficient business operations and decision-making.

Key Responsibilities:
- Design, develop, and maintain SQL databases, stored procedures, functions, triggers, and views.
- Develop, implement, and manage ETL processes using SSIS for data extraction, transformation, and loading.
- Create, optimize, and manage SSRS reports and dashboards based on business requirements.
- Ensure data integrity, performance tuning, and query optimization.
- Collaborate with business analysts and application developers to gather and understand reporting requirements.
- Monitor database performance and resolve issues related to data access and report generation.
- Automate repetitive data processes and reporting tasks.
- Perform data validation, testing, and documentation of developed solutions.
- Participate in database design and implementation of new projects or enhancements.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience with MS SQL Server (2016 or later).
- Strong expertise in T-SQL (queries, joins, indexing, query tuning).
- Proven experience developing and deploying SSRS reports and SSIS packages.
- Good understanding of ETL concepts, data warehousing, and reporting tools.
- Experience with data migration, data cleansing, and data transformation processes.
- Strong analytical and problem-solving skills.
- Familiarity with version control systems (e.g., Git, TFS) is a plus.
- Excellent communication and teamwork skills.

Please note that the job type is full-time, with benefits including health insurance and provident fund. The work location is in person.
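The query-tuning loop this posting emphasizes (inspect the plan, add an index, confirm the plan changes) can be demonstrated end to end with sqlite3; on SQL Server the equivalent tools are execution plans and CREATE NONCLUSTERED INDEX. The table and query below are hypothetical.

```python
# Before/after indexing: watch the plan flip from a full scan to an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(10_000)],
)

QUERY = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

def show_plan(label: str) -> None:
    plan = conn.execute("EXPLAIN QUERY PLAN " + QUERY).fetchall()
    print(label, "->", [row[-1] for row in plan])

show_plan("before index")   # SCAN orders (full table scan)
conn.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")
show_plan("after index")    # SEARCH orders USING INDEX ix_orders_customer ...
```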
posted 1 month ago

Integration Developer

Celebal Technologies
Experience: 1 to 5 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Python
  • Java
  • Flask
  • Django
  • Spring Boot
  • Relational databases
  • OAuth
  • Azure DevOps
  • GitHub
  • Kubernetes
  • Kafka
  • OData
  • SOAP
  • Angular
  • Distributed systems
  • C#/.NET Core
  • Node.js
  • FastAPI
  • Express.js
  • NestJS
  • Azure Services
  • Logic Apps
  • Azure Functions
  • API Management
  • Service Bus
  • Event Hub
  • Nonrelational databases
  • OpenID Connect
  • API security
  • Graph API
  • Azure Networking services
  • VNets
  • NSGs
  • Load Balancers
  • Application Gateways
  • React
  • Razor Pages
  • Blazor
  • AIS solutions
  • Event-driven applications
Job Description
As a Software Developer in this role, you will be responsible for the following:
- Developing scalable and efficient applications primarily using C#/.NET Core (MVC, Web API), with experience in Python (FastAPI, Flask, Django), Node.js (Express.js, NestJS), or Java (Spring Boot) as secondary technologies. Following SOLID principles to design modular, testable, and high-performance APIs is essential.
- Demonstrating strong expertise in implementing and understanding various testing methodologies, including unit testing, smoke testing, and integration testing.
- Designing and building solutions that leverage Azure services like Logic Apps, Azure Functions, API Management, Service Bus, and Event Hub.
- Designing and optimizing relational and non-relational databases.
- Implementing secure REST APIs with expertise in OAuth, OpenID Connect, and API security.
- Utilizing Azure DevOps and GitHub for version control and CI/CD.
- Contributing to code reviews, best practices, and documentation.

Good to have:
- Expertise in creating efficient and scalable system architectures to ensure high performance, reliability, and cost-effectiveness.
- Knowledge of Kubernetes, Kafka, and containerization.
- Familiarity with OData, Graph API, and SOAP.
- Understanding of Azure networking services like VNets, NSGs, Load Balancers, and Application Gateways.
- Knowledge of frontend frameworks and technologies such as React, Angular, Razor Pages, or Blazor.
- Experience in creating resilient and highly available AIS (Azure Integration Services) solutions.
- Understanding of distributed systems and event-driven applications.
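Behind "secure REST APIs with OAuth" is usually the OAuth 2.0 client-credentials flow: obtain a token, then call the API with it. A minimal Python sketch; the token endpoint, client credentials, scope, and API URL are hypothetical placeholders.

```python
# OAuth 2.0 client-credentials flow with requests.
import requests

TOKEN_URL = "https://login.example.com/oauth2/v2.0/token"
API_URL = "https://api.example.com/v1/orders"

def get_token() -> str:
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": "my-service-app",
            "client_secret": "...",   # from a secrets store, never hard-coded
            "scope": "api://orders/.default",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def list_orders() -> list[dict]:
    headers = {"Authorization": f"Bearer {get_token()}"}
    resp = requests.get(API_URL, headers=headers, timeout=10)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(list_orders()[:3])
```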
posted 2 months ago
Experience: 7 to 11 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • SQL
  • Qlik Sense
  • SSIS
  • Data transformation
  • JOINS
  • NPrinting
  • ETL processes
  • Set Analysis
  • Data formatting
  • Logic implementation
Job Description
Role Overview: As a Qlik Sense Developer, you will develop and maintain Qlik Sense applications to meet the data visualization needs of the organization. You will use your strong background in SQL and Qlik Sense to create interactive dashboards and reports that provide valuable insights to stakeholders.

Key Responsibilities:
- Develop and optimize SQL queries, including tuning, stored procedure development, and table management
- Use Qlik Sense to create interactive dashboards and visualizations
- Manage different load types and create new QVDs for data storage
- Implement NPrinting for report generation
- Develop and customize charts and load scripts in Qlik Sense
- Apply business logic such as Set Analysis, data formatting, and data transformation
- Demonstrate proficiency in JOINs and logic implementation in SQL
- Use SSIS for ETL processes

Qualifications Required:
- 7 to 10 years of experience in Qlik Sense development and SQL
- Strong data visualization skills
- Proficiency in SSIS, query optimization, ETL processes, and data transformation
- Knowledge of creating and managing charts, tables, and dashboards in Qlik Sense
- Familiarity with Set Analysis, business logic implementation, and data formatting in Qlik Sense
posted 2 weeks ago
Experience: 10 to 16 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • data integration
  • data migration
  • data quality
  • Salesforce
  • PostgreSQL
  • REST
  • SOAP
  • Azure
  • Informatica PowerCenter
  • data models
  • team management
  • documentation
  • collaboration
  • communication skills
  • ETL processes
  • IDMC
  • Informatica IDMC
  • data transformations
  • API integration
  • cloud platform concepts
  • database knowledge
  • Data Warehouse (DWH) concepts
  • Talent solutions
  • client-facing roles
Job Description
As an ETL Architect Engineer with over 10 years of experience, you will be required to have the following core expertise:
- Extensive experience in ETL processes, data integration, and data migration.
- Strong knowledge of IDMC (Intelligent Data Management Cloud).
- Hands-on experience with Informatica IDMC components: CAI, CDI, IDQ.

Your technical skills should include proficiency in ETL processes, data transformations, and data quality, along with experience with Salesforce and PostgreSQL components. Hands-on API integration using CDI and CAI, real-time data processing with REST and SOAP implementations, familiarity with cloud platform concepts, experience with Informatica PowerCenter, and strong database knowledge are also essential. You should be well versed in Data Warehouse (DWH) concepts and large-volume data implementations, and designing and implementing data models for data-driven initiatives should be one of your strong suits.

Preferred experience for this role includes talent-solutions experience and previous involvement in client-facing roles and team management.

Your responsibilities will involve creating and maintaining comprehensive documentation for ETL processes, data mappings, and workflows, and collaborating with business analysts, data architects, and stakeholders to understand requirements and deliver solutions. Strong communication skills for cross-functional collaboration will be essential for success in this position.
posted 1 week ago
Experience: 6 to 10 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • F5 Load Balancer
  • DNS
  • TCPIP
  • HTTPS
  • BGP
  • OSPF
  • MPLS
  • QoS
  • ACLs
  • DHCP
  • DNSSEC
  • Python
  • Ansible
  • Data Center LAN Fabric
  • SSL/TLS
Job Description
As a Sr Network Engineer (L3) at Adobe, you will deploy and support high-availability, scalable, and secure network solutions for enterprise, private cloud, and data center environments. Your expertise in F5 Load Balancers and Data Center LAN Fabric will be crucial for optimizing application delivery, performance, and reliability.

Key Responsibilities:
- Develop, deploy, and maintain F5 BIG-IP solutions, including Local Traffic Manager (LTM), Global Traffic Manager/Domain Name System (GTM/DNS), Application Security Manager (ASM), and Access Policy Manager (APM).
- Integrate F5 with cloud platforms (AWS, Azure, GCP) and on-premises infrastructure.
- Troubleshoot and resolve complex load balancing and application delivery issues.
- Implement contemporary network frameworks like EVPN, VxLAN, and SDN for Data Center LAN Fabric.
- Operate switches, routers, and fabric controllers from Cisco, Arista, Juniper, Versa, and NSX.
- Understand DNS, DHCP, and IPAM solutions (Infoblox, F5 GTM/DNS).

Qualifications & Skills:
- 6+ years of experience in enterprise-grade network engineering.
- Deep expertise in F5 modules (LTM, GTM/DNS, ASM, APM) and iRules scripting.
- Expertise in network fabrics, data center LAN switching, and automation using Python and Ansible.
- Strong understanding of networking protocols such as TCP/IP, SSL/TLS, HTTP/S, BGP, OSPF, MPLS, QoS, ACLs, DHCP, and DNSSEC.
- Experience with F5, Cisco, and Arista.
- Certifications: F5 Certified Administrator/Expert, CCNP.
- Excellent troubleshooting, analytical, and communication skills.

Adobe is committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. Your contributions as a Sr Network Engineer will play a vital role in transforming how companies interact with customers across every screen.
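The Python automation the posting mentions often starts with F5's iControl REST API. A minimal sketch that lists virtual servers over `/mgmt/tm/ltm/virtual`; the host, credentials, and TLS handling are hypothetical lab-style placeholders, and a production script would verify certificates and use token-based auth.

```python
# Poll BIG-IP virtual servers via iControl REST with requests.
import requests

BIGIP = "https://bigip.example.com"
AUTH = ("admin", "...")  # placeholder; prefer token auth in production

def list_virtual_servers() -> list[str]:
    resp = requests.get(
        f"{BIGIP}/mgmt/tm/ltm/virtual",
        auth=AUTH,
        verify=False,  # lab-only shortcut; verify certificates in production
        timeout=10,
    )
    resp.raise_for_status()
    # iControl REST collections return their members under "items".
    return [item["fullPath"] for item in resp.json().get("items", [])]

if __name__ == "__main__":
    for vs in list_virtual_servers():
        print(vs)
```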