
73 Informatica ETL Developer Jobs nearby Vellore

posted 3 weeks ago

ETL & Python Automation Tester

Experience: 7 to 12 Yrs
location
Chennai, Pune
skills
  • testing
  • webdriver
  • etl testing
  • automation
  • automation testing
  • python
  • database testing
  • selenium testing
  • pytest
Job Description
We are seeking a skilled ETL & Python Automation Tester to join our QA team. The ideal candidate will have strong experience in ETL testing, data validation, and Python-based automation to ensure the integrity, accuracy, and quality of data across data pipelines, warehouses, and reporting systems.

Role: Automation Testing - ETL + Python Automation
Mode: Hybrid
Location: Pune/Chennai

Key Responsibilities
- Design, develop, and execute ETL test cases for data ingestion, transformation, and loading processes.
- Validate data movement between source, staging, and target systems to ensure end-to-end data integrity.
- Automate ETL and data validation tests using Python and relevant testing frameworks.
- Create and maintain automated scripts for regression and functional testing.
- Work closely with Data Engineers, Developers, and Business Analysts to understand data flows and business rules.
- Perform data reconciliation, transformation validation, and schema testing.
- Implement and maintain test frameworks using pytest, unittest, or Robot Framework.
- Report and track defects in JIRA (or similar tools) and work with teams to resolve them.
- Contribute to test strategy, planning, and continuous integration (CI/CD) processes.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in software testing, with a focus on ETL and data testing.
- Strong knowledge of SQL for data verification and validation.
- Hands-on experience with Python scripting for automation.
- Experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage, Glue).
- Familiarity with data warehouses (Snowflake, Redshift, BigQuery, etc.).
- Knowledge of automation frameworks (pytest, unittest, Robot Framework).
- Experience with version control systems (Git) and CI/CD pipelines (Jenkins, GitLab CI, etc.).
- Strong analytical and problem-solving skills.
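For a sense of what the pytest-based ETL checks described above can look like in practice, here is a minimal sketch; the in-memory SQLite database, table names, and columns are illustrative assumptions, not details from the posting:

```python
# Minimal sketch of pytest-based ETL validation. An in-memory SQLite
# database stands in for the real source/target systems; tables and
# columns here are illustrative only.
import sqlite3

import pytest


@pytest.fixture
def conn():
    # In a real suite this would connect to the actual source/target DBs.
    c = sqlite3.connect(":memory:")
    c.executescript(
        """
        CREATE TABLE src_orders (id INTEGER, amount REAL);
        CREATE TABLE tgt_orders (id INTEGER, amount REAL);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
        INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5);
        """
    )
    yield c
    c.close()


def test_row_counts_match(conn):
    # Completeness check: no rows dropped between source and target.
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt, f"Row count mismatch: source={src}, target={tgt}"


def test_no_orphan_target_rows(conn):
    # Integrity check: every target row must trace back to a source row.
    orphans = conn.execute(
        "SELECT COUNT(*) FROM tgt_orders t "
        "LEFT JOIN src_orders s ON s.id = t.id WHERE s.id IS NULL"
    ).fetchone()[0]
    assert orphans == 0
```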

posted 1 week ago

ETL Test Lead

Flyers Soft
Experience: 7 to 11 Yrs
location
Chennai, Tamil Nadu
skills
  • Oracle SQL Developer
  • Informatica PowerCenter
  • Strong SQL & Database Expertise
  • ETL Testing (Informatica)
  • ETL Automation (IDTW)
  • API & Interface Testing
  • Experience in Banking or Financial Services (FS) domain
  • SQL Server Management Studio (SSMS)
  • Toad for Oracle/SQL Server
  • ETL & Data Integration Tools
Job Description
You have 7+ years of experience and possess the following technical skills:
- Strong SQL & database expertise
- ETL testing in Informatica
- ETL automation (IDTW)
- API & interface testing
- Experience in the Banking or Financial Services (FS) domain

Additionally, you should have expertise in any of the following tools:
- SQL Server Management Studio (SSMS)
- Oracle SQL Developer
- Toad for Oracle/SQL Server
- ETL & data integration tools
- Informatica PowerCenter

If you are interested in this opportunity, please share your resume with gopinath.sonaiyan@flyerssoft.com.
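As an illustration of the SQL-heavy ETL testing this role calls for, a typical source-to-target reconciliation query might look like the following sketch; the schema, table, and column names are assumptions for the example:

```python
# Illustrative source-vs-target reconciliation query of the kind used in
# ETL testing; schema and table names are invented for this sketch.
RECONCILIATION_SQL = """
SELECT s.account_id, s.balance AS src_balance, t.balance AS tgt_balance
FROM staging.accounts s
FULL OUTER JOIN warehouse.accounts t ON t.account_id = s.account_id
WHERE t.account_id IS NULL          -- missing in target
   OR s.account_id IS NULL          -- unexpected in target
   OR s.balance <> t.balance        -- value drift after transformation
"""
```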
posted 2 months ago

Informatica MDM Developer

Realtek Consulting LLC
Experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Java Development
  • Informatica MDM
  • BES
  • ActiveVOS
  • MDM Hub
  • e360 Application Development
  • Provisioning Tool
Job Description
Role Overview: As an Informatica MDM Developer at Realtek Consulting LLC, you will play a crucial role in configuring and managing MDM Hub components, implementing Match & Merge rules, developing and maintaining e360 user interfaces, designing BES External Calls, integrating BES services with external systems, developing ActiveVOS workflows, implementing workflow automation, using the Provisioning Tool for data modeling, defining data quality compliance, and writing custom Java to extend MDM functionality.

Key Responsibilities:
- Configure and manage MDM Hub components such as Landing, Staging, Base Objects, and Hierarchies.
- Implement Match & Merge rules for data consolidation.
- Develop and maintain e360 user interfaces for MDM.
- Design and implement BES External Calls for data validation and retrieval.
- Integrate BES services with external systems via REST/SOAP APIs.
- Develop, customize, and manage ActiveVOS workflows.
- Implement workflow automation for data stewardship and approvals.
- Use the Provisioning Tool for data modeling and UI configuration.
- Define data quality compliance.
- Write custom Java to extend MDM functionality, and develop code for data processing and custom APIs.

Qualifications Required:
- 7+ years of experience in Informatica MDM and related tools.
- Strong understanding of MDM architecture and data governance.
- Hands-on experience (3 years) with Java, SQL, and REST APIs.
- Knowledge of ActiveVOS workflow design and troubleshooting.

Note: Realtek Consulting LLC, founded in 2011, is a technology firm specializing in software, engineering, staffing, and digital solutions for organizations globally. The company focuses on driving business transformation through advanced technology and has established strong client partnerships in the United States, the United Kingdom, Europe, Australia, and India.
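To give a flavor of the REST-based integration work mentioned above, here is a hedged Python sketch of an external system reading a business entity over REST; the host, endpoint path, and payload shape are hypothetical, not Informatica's documented BES API:

```python
# Hedged sketch of calling an MDM business-entity REST service from an
# external system; the base URL and endpoint path are hypothetical.
import requests


def fetch_customer(base_url: str, customer_id: str, auth: tuple) -> dict:
    """Read a customer business entity and return its JSON payload."""
    resp = requests.get(
        f"{base_url}/customers/{customer_id}",  # hypothetical endpoint
        auth=auth,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```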

posted 3 weeks ago
Experience: 4 to 8 Yrs
location
Coimbatore, All India
skills
  • Talend
  • Informatica
  • Datastage
  • Oracle
  • SQL Server
  • JIRA
  • SVN
  • Teradata
Job Description
You will be responsible for developing ETL processes for our reputed client based in Coimbatore. Your main responsibilities will include:
- Utilizing Talend or other ETL tools such as Informatica or Datastage to design and implement efficient ETL processes
- Working with databases such as Teradata, Oracle, or SQL Server to extract, transform, and load data
- Using supporting tools like JIRA and SVN to track and manage development tasks effectively

Your qualifications should include:
- At least 4 years of experience in ETL development roles
- Proficiency in ETL tools like Talend
- Strong database skills, with a preference for Teradata, Oracle, or SQL Server
- Ability to work within a notice period of immediate to 30 days

Please note, the provided job description did not include any additional details about the company.
posted 2 months ago

Informatica

Cognizant
Experience: 6 to 10 Yrs
location
Chennai, Tamil Nadu
skills
  • SQL Server
  • data integration
  • database design
  • optimization
  • data quality
  • data governance
  • performance tuning
  • scripting languages
  • Informatica Cloud
  • ETL processes
  • Agile development methodologies
  • cloud-based data solutions
Job Description
As a Senior Developer with 6 to 9 years of experience, your role will involve working with SQL Server, Informatica Cloud, and Informatica to develop and maintain data systems. You will be responsible for ensuring high performance and availability of our data systems by collaborating with cross-functional teams to gather and analyze business requirements.

**Key Responsibilities:**
- Develop and maintain SQL Server databases for high performance and availability.
- Design and implement ETL processes using Informatica Cloud.
- Provide technical expertise in SQL Server and Informatica to optimize data workflows.
- Troubleshoot and resolve issues related to data integration and database performance.
- Ensure data quality and integrity through best practices and standards.
- Create and maintain documentation for data processes and systems.
- Conduct code reviews and provide feedback for code quality.
- Participate in designing data solutions to meet business needs.
- Monitor and optimize ETL processes for efficient data processing.
- Stay updated with the latest trends in data management and integration.
- Provide support and guidance to junior developers.
- Contribute to the continuous improvement of data systems and processes.

**Qualifications:**
- Strong experience with SQL Server, including database design and optimization.
- Hands-on experience with Informatica Cloud for data integration and transformation.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- Strong understanding of data quality and governance principles.
- Proactive approach to identifying and addressing potential issues.

**Certifications Required:**
- Certified Informatica Cloud Data Integration Specialist
- Microsoft Certified: Azure Data Engineer Associate
posted 2 months ago
Experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • SAP Data services
  • ETL
  • SQL
  • Oracle
  • SAP BO
  • Informatica
  • Data processing
  • Data cleansing
  • Problem solving
  • Analytical skills
  • Stakeholder Management
  • Data warehouse
  • Data structure design
Job Description
At Capgemini Engineering, you will be part of a global team of engineers, scientists, and architects who collaborate to help innovative companies unleash their potential. From autonomous cars to life-saving robots, your expertise in tools like SAP Data Services, ETL, and data warehouse concepts will be crucial. Your role will involve SQL writing and advanced SQL tuning (preferably Oracle), and you will have the opportunity to work on batch and real-time processing of large-scale data sets. Knowledge of SAP BO and Informatica will be advantageous as you focus on optimal use-case data structure design, build, and performance tuning. You will be responsible for the ingestion, transformation, and cleansing of structured and unstructured data across a heterogeneous landscape.

As part of the team, you will need excellent problem-solving and analytical skills. Your experience in working with traditional waterfall and agile methodologies, both independently and in project/product teams under pressure and deadlines, will be valuable. Additionally, your stakeholder management skills will support use cases and data consumers effectively.

Capgemini is a global business and technology transformation partner, committed to accelerating organizations' transition to a digital and sustainable world. With a diverse team of over 340,000 members in 50+ countries, Capgemini has a strong heritage of over 55 years. Trusted by clients to unlock technology's value for their business needs, Capgemini offers end-to-end services and solutions, including strategy, design, and engineering. The company's expertise in AI, cloud, and data, combined with deep industry knowledge and a partner ecosystem, enables it to deliver tangible impact. In 2023, Capgemini reported global revenues of €22.5 billion.
posted 2 months ago
Experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • survivorship
  • Snowflake
  • APIs
  • messaging platforms
  • data governance
  • data quality
  • technical documentation
  • test cases
  • Informatica MDM Cloud Edition
  • match & merge rules
  • hierarchy management
  • data stewardship workflows
  • landing
  • staging
  • base objects
  • cleanse functions
  • match rules
  • trust/survivorship rules
  • cloud data lakes/warehouses
  • Redshift
  • Synapse
  • Informatica Cloud IICS
  • data lineage
  • compliance standards
  • deployment artifacts
Job Description
As a Master Data Management Developer, your main responsibilities will include:
- Designing and developing end-to-end Master Data Management solutions using Informatica MDM Cloud Edition or on-prem hybrid setups.
- Implementing match & merge rules, survivorship, hierarchy management, and data stewardship workflows.
- Configuring landing, staging, base objects, mappings, cleanse functions, match rules, and trust/survivorship rules.
- Integrating MDM with cloud data lakes/warehouses (e.g., Snowflake, Redshift, Synapse) and business applications.
- Designing batch and real-time integration using Informatica Cloud (IICS), APIs, or messaging platforms.
- Working closely with data architects and business analysts to define MDM data domains (e.g., Customer, Product, Vendor).
- Ensuring data governance, quality, lineage, and compliance standards are followed.
- Providing production support and enhancements to existing MDM solutions.
- Creating and maintaining technical documentation, test cases, and deployment artifacts.

No additional details of the company were provided in the job description.
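For readers unfamiliar with match & merge and survivorship, the following conceptual Python sketch mirrors the idea of trust-based survivorship: when two records match, each attribute survives from the most trusted source. This is not Informatica MDM's engine or API; the source systems and trust scores are invented:

```python
# Conceptual illustration of match & merge with trust-based survivorship.
# Not Informatica MDM's API; trust scores and sources are invented.
TRUST = {"CRM": 2, "ERP": 3, "WEB": 1}  # higher = more trusted (assumed)


def merge(records: list) -> dict:
    """Merge matched records attribute by attribute into a golden record."""
    golden = {}
    fields = {k for r in records for k in r if k != "source"}
    for field in fields:
        candidates = [r for r in records if r.get(field) is not None]
        if candidates:
            # The value from the most trusted source survives.
            best = max(candidates, key=lambda r: TRUST[r["source"]])
            golden[field] = best[field]
    return golden


matched = [
    {"source": "CRM", "name": "A. Kumar", "phone": "98400-00000"},
    {"source": "ERP", "name": "Arun Kumar", "phone": None},
]
print(merge(matched))  # name survives from ERP, phone from CRM
```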
posted 2 months ago

Informatica MDM Developer

Golden Opportunities
Experience: 5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • MDM
  • SQL
  • PLSQL
  • Oracle
  • data integration
  • Informatica MDM
  • Informatica Data Quality
  • Teradata
  • data modelling
  • ETL processes
Job Description
**Job Description:** As an Informatica MDM Developer, you should have over 9 years of experience, with a minimum of 5 years specifically in MDM on-premises solutions. Your proficiency should include Informatica MDM components such as Hub Console, IDD, Match & Merge, and Hierarchy Manager. You are expected to have strong knowledge of Informatica Data Quality and data profiling, along with experience in SQL, PL/SQL, and relational databases like Oracle and Teradata. Your understanding of data modelling, data integration, and ETL processes will be crucial in this role. You should have hands-on experience implementing MDM solutions in domains like Customer, Product, or Employee, and be familiar with integrating MDM systems with enterprise applications.

**Key Responsibilities:**
- Work with Informatica MDM components such as Hub Console, IDD, Match & Merge, and Hierarchy Manager
- Apply strong knowledge of Informatica Data Quality and data profiling
- Use SQL, PL/SQL, and relational databases (Oracle, Teradata, etc.)
- Implement MDM solutions for domains such as Customer, Product, or Employee
- Integrate MDM systems with enterprise applications

**Qualifications Required:**
- Minimum 9+ years of overall experience
- Minimum 5+ years of experience in MDM on-premises solutions
- Proficiency in Informatica MDM components
- Strong knowledge of Informatica Data Quality and data profiling
- Experience with SQL, PL/SQL, and relational databases
- Understanding of data modelling, data integration, and ETL processes
- Hands-on experience implementing MDM solutions for various domains
- Familiarity with integrating MDM systems with enterprise applications

(Note: Additional company details are not provided in the job description.)
posted 2 months ago
Experience: 5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • Python
  • SQL
  • SAS DI
  • Snowflake
  • Tableau
  • ETL development
  • AWS EMR
  • PySpark
  • AWS S3
Job Description
As a Python Developer, you will be joining a team working with a renowned financial institution to deliver business value through your expertise. Your role will involve analyzing existing SAS DI pipelines and SQL-based transformations, translating and optimizing SAS SQL logic into Python code using frameworks like PySpark, developing scalable ETL pipelines on AWS EMR, implementing data transformation and aggregation logic, designing modular code for distributed data processing tasks, integrating EMR jobs with various systems, and developing Tableau reports for business reporting.

Key Responsibilities:
- Analyze existing SAS DI pipelines and SQL-based transformations.
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark.
- Develop and maintain scalable ETL pipelines using Python on AWS EMR.
- Implement data transformation, cleansing, and aggregation logic to meet business requirements.
- Design modular and reusable code for distributed data processing tasks on EMR clusters.
- Integrate EMR jobs with upstream and downstream systems like AWS S3, Snowflake, and Tableau.
- Develop Tableau reports for business reporting.

Qualifications Required:
- 6+ years of experience in ETL development, with a minimum of 5 years working with AWS EMR.
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- Proficiency in Python for data processing and scripting.
- Proficiency in SQL and experience with ETL tools such as SAS DI or Informatica.
- Hands-on experience with AWS services including EMR, S3, IAM, VPC, and Glue.
- Familiarity with data storage systems like Snowflake or RDS.
- Excellent communication skills and ability to collaborate in a team environment.
- Strong problem-solving skills and capability to work independently.

Join us in this exciting opportunity to contribute to the success of our projects and collaborate with a team of skilled professionals in a challenging environment.
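A minimal PySpark sketch of the cleanse/transform/aggregate pattern such an EMR pipeline typically implements; the S3 paths, column names, and app name are placeholders, not details from the posting:

```python
# Minimal PySpark sketch of a cleanse -> transform -> aggregate ETL step;
# the S3 paths and column names are placeholders for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

txns = spark.read.parquet("s3://example-bucket/raw/transactions/")

daily = (
    txns.filter(F.col("status") == "POSTED")           # cleanse
        .withColumn("txn_date", F.to_date("txn_ts"))   # transform
        .groupBy("account_id", "txn_date")             # aggregate
        .agg(F.sum("amount").alias("daily_amount"))
)

daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily/")
```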
posted 1 month ago
Experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Test Design
  • Test Planning
  • ETL Testing
  • Glue
  • API Testing
  • Test Automation
  • Informatica
  • SQL
  • AWS Services
  • Redshift
  • Unix/Linux
  • ICEDQ
  • Data Pipeline Testing
Job Description
As an Internal Data Warehouse Developer & Administrator in Chennai, Tamil Nadu, you will be responsible for the following:
- Designing, developing, and executing detailed test plans and test cases for ETL processes.
- Conducting end-to-end data validation across various data sources such as claims, member, and provider data.
- Collaborating with data engineers, business analysts, and product owners to understand ETL requirements and test accordingly.
- Utilizing ICEDQ or similar tools for ETL data testing and reconciliation.
- Performing SQL-based backend testing and managing test data.
- Documenting test results and providing root cause analysis for data quality issues.
- Participating in Agile/Scrum ceremonies and continuous integration testing cycles.

To excel in this role, you should possess the following skills:
- Deep knowledge of test design, test planning, and ETL testing.
- Hands-on experience with AWS services, particularly Redshift and Glue.
- Working knowledge of Unix/Linux, API testing, and test automation.
- Familiarity with ICEDQ, Informatica, and data pipeline testing.
- Strong SQL skills and exposure to cloud-based data environments.
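One common backend-testing technique implied above is set-based reconciliation via row fingerprints, which avoids slow row-by-row comparisons; the sketch below illustrates the idea with invented claim rows:

```python
# Sketch of a fingerprint-based comparison sometimes used in backend ETL
# testing when row-by-row diffs are too slow; the rows are invented.
import hashlib


def row_fingerprint(row: tuple) -> str:
    """Stable per-row hash so source and target sets can be diffed."""
    return hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()


def diff(source_rows, target_rows):
    src = {row_fingerprint(r) for r in source_rows}
    tgt = {row_fingerprint(r) for r in target_rows}
    return src - tgt, tgt - src  # (missing in target, unexpected in target)


missing, unexpected = diff(
    [(1, "claim-A", 250.0), (2, "claim-B", 90.0)],
    [(1, "claim-A", 250.0)],
)
print(len(missing), len(unexpected))  # 1 0
```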
posted 2 weeks ago
Experience: 5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • coordination skills
  • communication
  • stakeholder management
  • ServiceNow
  • AWS
  • Azure
  • GCP
  • ITIL practices
  • monitoring
  • observability tools
  • CI/CD pipelines
  • DevOps practices
  • cloud environments
Job Description
As a Release and Change Manager with 5 to 7 years of experience in Hyderabad or Chennai, your role will involve managing end-to-end release, incident, and change management processes for enterprise applications and platforms. You should have hands-on experience in ITIL practices, strong coordination skills, and a working knowledge of monitoring and observability tools.

**Key Responsibilities:**
- **Release Management**
  - Plan, schedule, and coordinate software releases across multiple environments.
  - Collaborate with development, QA, and operations teams to ensure smooth deployments.
  - Maintain the release calendar and ensure alignment with business priorities.
  - Conduct release readiness reviews and post-release retrospectives.
- **Incident Management**
  - Act as the first point of contact for major incidents.
  - Drive incident resolution through coordination with technical teams.
  - Ensure timely communication to stakeholders during outages or service disruptions.
  - Maintain incident logs and contribute to root cause analysis.
- **Change Management**
  - Review and approve change requests in alignment with change control policies.
  - Conduct impact assessments and risk evaluations.
  - Facilitate Change Advisory Board (CAB) meetings.
  - Ensure proper documentation and rollback plans are in place.
- **Monitoring and Observability**
  - Work with monitoring tools such as Splunk, Dynatrace, AppDynamics, Prometheus, and Grafana.
  - Set up alerts and dashboards for proactive issue detection.
  - Analyze logs and metrics to identify performance bottlenecks.

**Qualifications Required:**
- 5 to 7 years of experience in release, incident, or change management.
- Strong understanding of the ITIL framework.
- Experience with CI/CD pipelines and DevOps practices.
- Familiarity with monitoring tools and observability platforms.
- Excellent communication and stakeholder management skills.
- Ability to work in a fast-paced, cross-functional environment.
- ITIL Foundation Certification.
- Experience in cloud environments (AWS, Azure, GCP).
- Exposure to ServiceNow or similar ITSM platforms.

As an employee of Virtusa, you will be part of a global team of 27,000 professionals who value teamwork, quality of life, and personal development. Virtusa provides exciting projects, opportunities, and exposure to state-of-the-art technologies to help you grow throughout your career. Collaboration and a dynamic team environment are key aspects of our company culture, providing you with a platform to nurture new ideas and achieve excellence.
posted 3 weeks ago

Java/J2EE Developer

TekWissen India
Experience: 2 to 6 Yrs
location
Chennai, Tamil Nadu
skills
  • API
  • GCP
  • Java
  • Apache Struts
Job Description
As a Software Engineer III - Core Engineer I at TekWissen in Chennai, you will be responsible for designing, developing, testing, and maintaining software applications and products to meet customer needs. You will be involved in the entire software development lifecycle, including designing software architecture, writing code, testing for quality, and deploying the software to meet customer requirements. Additionally, you will be engaged in full-stack software engineering roles, developing all components of software including the user interface and server side.

Key Responsibilities:
- Engage with customers to deeply understand their use cases, pain points, and requirements, showcasing empathy and advocating for user-centric software solutions.
- Solve complex problems by designing, developing, and delivering using various tools, languages, frameworks, methodologies (like agile), and technologies.
- Assess the requirements of the software application or service and determine the most suitable technology stack, integration method, deployment strategy, etc.
- Create high-level software architecture designs that outline the overall structure, components, and interfaces of the application.
- Collaborate with cross-functional teams like product owners, designers, and architects.
- Define and implement software test strategy, guidelines, policies, and processes in line with the organization's vision, industry regulations, and market best practices.
- Continuously improve performance, optimize the application, and implement new technologies to maximize development efficiency.
- Support security practices to safeguard user data, including encryption and anonymization.
- Create user-friendly and interactive interfaces.
- Develop and maintain back-end applications like APIs and microservices using server-side languages.
- Evaluate and incorporate emerging technologies and capabilities to deliver solutions, monitoring and participating in solutions for new stack layers, often involving industry collaboration.

Skills Required:
- API
- Apache Struts
- GCP
- Java

Skills Preferred:
- Jira
- ETL/Informatica

Experience Required:
- Engineer I: 2-4 years in IT, including 2 years of development experience in at least one coding language or framework

Education Required:
- Bachelor's Degree or Master's Degree

As an equal opportunity employer supporting workforce diversity, TekWissen Group fosters a collaborative and inclusive work environment.
posted 2 months ago
Experience: 12 to 16 Yrs
location
Chennai, Tamil Nadu
skills
  • Automation testing
  • ETL
  • MDM
  • SQL
  • JIRA
  • Confluence
  • data pipelines
  • Informatica MDM Cloud
  • Agile project delivery
  • Property and Casualty domain knowledge
  • Python scripting
Job Description
As a QA Lead with over 12 years of experience in the insurance domain, specializing in automation testing for MDM and data integration platforms, your role will involve the following:
- Leading QA strategy and execution across cloud-based environments to ensure high-quality delivery and compliance with data standards.
- Applying expertise in automation testing for ETL, MDM, and data pipelines.
- Validating Informatica MDM Cloud implementations.
- Conducting SQL-based data verification and backend testing.
- Delivering in Agile projects using JIRA and Confluence.
- Collaborating effectively with cross-functional teams and stakeholders.

Qualifications required for this role include:
- Minimum 12 years of experience in QA, with a focus on automation testing in the insurance domain.
- Strong expertise in Property and Casualty insurance.
- Proficiency in Python scripting would be a nice-to-have skill.

If you have the required experience and skills, and are looking to lead QA efforts in the insurance domain with a focus on automation testing, this role could be a great fit for you.
posted 2 months ago
Experience: 5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • data governance
  • data profiling
  • data cleansing
  • data validation
  • data stewardship
  • Informatica
  • data quality management
  • data governance tools
  • data quality tools
  • AWS Glue
  • AWS Glue DataBrew
Job Description
As a Data Governance and Data Quality Specialist at our company, your role will involve developing, implementing, and maintaining data governance frameworks and processes to ensure the accuracy, completeness, and reliability of organizational data. You will collaborate with various stakeholders to define data standards, policies, and procedures, and oversee compliance with regulatory requirements related to data management.

**Key Responsibilities:**
- Collaborate with cross-functional teams to define data quality standards and metrics, and establish processes for monitoring and improving data quality.
- Design and implement data quality assurance processes, including data profiling, cleansing, and validation, to identify and resolve data issues.
- Establish data stewardship roles and responsibilities and facilitate data stewardship activities to ensure data ownership and accountability.
- Conduct regular audits and assessments of data governance and data quality practices, and recommend improvements based on industry best practices.
- Collaborate with IT teams to implement data governance tools and technologies to support data management and compliance efforts.
- Stay informed about industry trends and developments in data governance and data quality practices, and recommend strategies for continuous improvement.
- Serve as a subject matter expert on data governance and data quality matters, representing the organization in meetings, conferences, and industry forums.
- Support data-related initiatives and projects by providing expertise and guidance on data governance and data quality requirements.

**Qualifications:**
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data governance, data quality management, or a related field.
- Strong understanding of data governance principles and data quality frameworks.
- Proficiency in data profiling tools, data quality tools, and data governance platforms.
- Excellent analytical, problem-solving, and communication skills.
- Ability to work effectively in a cross-functional team environment and collaborate with stakeholders at all levels of the organization.
- Relevant certifications from Informatica will be a plus.

Knowledge of data governance tools like Alation or Informatica AXON, data catalogue tools such as the AWS Glue catalogue, Informatica EDC, or Alation, or data quality tools like AWS Glue DataBrew and Informatica Data Quality will be considered an added advantage.
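As a small illustration of the data-profiling step named above, the following pandas sketch computes completeness and uniqueness signals per column; the sample records are invented:

```python
# Hedged sketch of the kind of data-profiling pass that feeds a data
# quality assessment; the sample records are invented for illustration.
import pandas as pd

df = pd.DataFrame(
    {
        "customer_id": [1, 2, 2, None],
        "email": ["a@x.com", None, "b@x.com", "b@x.com"],
    }
)

profile = pd.DataFrame(
    {
        "null_rate": df.isna().mean(),   # completeness per column
        "distinct": df.nunique(),        # uniqueness signal per column
        "duplicated_rows": [df.duplicated().sum()] * len(df.columns),
    }
)
print(profile)
```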
posted 2 months ago
Experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Data analytics
  • OBIEE
  • Business Analysis
  • Informatica
  • ODI
  • SSIS
  • Reporting
  • Dashboards
  • BI Publisher
  • Security Implementation
  • Data Flow
  • Data Visualization
  • Data Models
  • Data Modeling
  • Agile
  • SQL
  • PLSQL
  • OAC
  • Visualisation Tools
  • ETL/ELT Process
  • RPD Logical Modeling
  • Agents
  • Self-Service Data Preparation
  • Data Sync
  • Data Sets
  • DWH Concepts
  • Waterfall SDLC
Job Description
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

**Role Overview:**
As a Senior with expertise in data analytics specializing in OBIEE/OAC at EY GDS Data and Analytics (D&A), you will be responsible for creating and managing large BI and analytics solutions using visualization tools such as OBIEE/OAC to turn data into knowledge. Your background in data and business analysis, analytical skills, and excellent communication abilities will be key in this role.

**Key Responsibilities:**
- Work as a team member and lead to contribute to various technical streams of OBIEE/OAC implementation projects
- Provide product- and design-level technical best practices
- Interface and communicate with the onsite coordinators
- Complete assigned tasks on time and provide regular status reports to the lead

**Qualification Required:**
- BE/BTech/MCA/MBA with adequate industry experience
- 3 to 7 years of experience in OBIEE/OAC
- Experience with OBIEE/OAC end-to-end implementation
- Understanding of the ETL/ELT process using tools like Informatica/ODI/SSIS
- Knowledge of reporting, dashboards, and RPD logical modeling
- Experience with BI Publisher and Agents
- Experience in security implementation in OAC/OBIEE
- Ability to manage self-service data preparation, data sync, and data flow, and to work with curated data sets
- Manage connections to multiple data sources, cloud and non-cloud, using the various data connectors available with OAC
- Create pixel-perfect reports and manage contents in the catalog, dashboards, prompts, and calculations
- Create data sets, map layers, multiple data visualizations, and stories in OAC
- Understanding of various data models, e.g., snowflake, data marts, star data models, data lakes, etc.
- Excellent written and verbal communication skills
- Cloud experience is an added advantage
- Migrating OBIEE on-premise to Oracle Analytics in the cloud
- Knowledge of and working experience with Oracle Autonomous Database
- Strong knowledge of DWH concepts
- Strong data modeling skills
- Familiarity with Agile and Waterfall SDLC processes
- Strong SQL/PLSQL with analytical skills

(Note: Additional details of the company are omitted as they are not available in the provided job description.)
posted 2 weeks ago

Ab Initio Developer

M/S. B. NANDI
Experience: 10 to 20 Yrs
Salary: 24 - 36 LPA
location
Chennai, Bangalore, Noida, Hyderabad, Kolkata, Amritsar, Uttar Dinajpur, Pune, Delhi, Uttarkashi

skills
  • developers
  • Ab Initio
  • development management
  • developer relations
  • technology evangelism
Job Description
Job Role, Duties and Responsibilities: An Ab Initio Developer is responsible for giving team status on a variety of projects. Their focus is to escalate issues as necessary, assess and communicate risks to the development schedule and project, and represent the data integration development team's interests in cross-functional project teams, with project success as the ultimate goal.

Responsibilities:
- Monitor and support existing production data pipelines developed in Ab Initio
- Analyze highly complex business requirements, designs, and/or data that require evaluation of intangible variance factors
- Debug daily production issues and rerun the jobs after understanding the issues
- Collaborate throughout the organisation on effective identification of technical issues
- Participate and provide feedback in design reviews
- Complete component design documents on assigned projects

Requirements:
- 7+ years of actual development experience building ETL applications/processes using SAS
- Relevant years of hands-on experience with Ab Initio and Hadoop technologies (HDFS, Hive, Impala, etc.)
- Good understanding of ETL concepts and tools like Informatica, DataStage, and CloverETL
- Experience with relational databases like Oracle and SQL Server, and with PL/SQL
- Understanding of Agile methodologies as well as SDLC life cycles and processes
- Experience in writing technical and functional documentation

Soft Skills:
- Ability to work as an individual with minimal guidance/support
- Strong communication and team skills
- Strong analytical skills
posted 3 weeks ago

Collibra Developer

Anlage Infotech (India) P Ltd
Experience: 6 to 12 Yrs
location
Chennai
skills
  • Python
  • Java
  • Collibra governance
  • Catalog development and administration
  • API scripting
  • Workflow automation
Job Description
As a Collibra Developer at our Chennai location, you will be responsible for configuring, developing, and maintaining Collibra Data Intelligence Cloud to support enterprise data governance, cataloging, and metadata management initiatives. Your role will involve ensuring data discoverability, lineage tracking, and policy enforcement across systems. You will have the opportunity to work in a hybrid model from the client site at RMZ Millenia Business Park-II, Chennai.

**Key Responsibilities:**
- Configure and customize Collibra workflows, assets, and domains for metadata, glossary, and lineage management.
- Integrate Collibra with enterprise systems such as Power BI, Databricks, Informatica, Purview, and data warehouses.
- Automate metadata ingestion using Collibra APIs, the Catalog Connector, or Edge.
- Develop and maintain data stewardship workflows including policy approvals, data issue management, and exceptions.
- Design and implement governance operating models with defined roles like Steward, Owner, and Custodian.
- Define and enforce data classification, lineage, and compliance policies.
- Collaborate with business and technical teams to ensure metadata accuracy.
- Build custom dashboards and reports to track governance KPIs like data quality and compliance.
- Provide user training and documentation on Collibra usage.

**Qualifications Required:**
- Bachelor's degree in Computer Science, Information Technology, Electronics & Communication, Information Science, or Telecommunications.
- 7+ years of experience in Collibra Data Governance/Collibra Data Catalog development and administration.
- Domain experience in banking and finance is an added advantage.
- Experience working in Agile development environments with critical thinking and problem-solving skills.
- Strong understanding of metadata management, lineage, and governance principles.
- Proficiency in Collibra Console, DGC, APIs, and Workflow Designer (BPMN).
- Experience with Python or Java for API scripting and workflow automation.
- Knowledge of data governance tool integrations such as Informatica, Purview, and Power BI.
- Familiarity with Azure Data Services and hybrid cloud environments.
- Excellent documentation and communication skills.

If you meet the above requirements and are interested in this opportunity, please share the following details with Syed at syed.m@anlage.co.in:
- Total Experience:
- Current CTC:
- Expected CTC:
- Notice Period:
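For a flavor of the API scripting this role requires, here is a hedged Python sketch against Collibra's REST API; the instance URL and credentials are placeholders, and the exact endpoint path and response shape should be verified against your Collibra version's API documentation:

```python
# Sketch of API-driven metadata automation against Collibra's REST API.
# The host and credentials are placeholders; verify the endpoint against
# your Collibra version's documentation before relying on it.
import requests

BASE = "https://your-instance.collibra.com/rest/2.0"  # placeholder host


def find_assets(session: requests.Session, name: str) -> list:
    """Look up assets by name, e.g. before automating an ingestion step."""
    resp = session.get(f"{BASE}/assets", params={"name": name}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("results", [])


with requests.Session() as s:
    s.auth = ("svc_account", "secret")  # placeholder credentials
    for asset in find_assets(s, "Customer"):
        print(asset.get("id"), asset.get("name"))
```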
posted 2 weeks ago
Experience: 3 to 7 Yrs
location
Chennai, All India
skills
  • Java development
  • SQL
  • Data analysis
  • Production support
  • Informatica MDM
  • Informatica P360
  • ActiveVOS workflow management
Job Description
As an Informatica MDM-PIM Consultant at Infoya, your role will involve designing and configuring Informatica P360 solutions, Informatica MDM, Java development, SQL, and data analysis. Your expertise in production support and ActiveVOS workflow management will be crucial for delivering successful outcomes.

**Roles & Responsibilities:**
- Implement and manage Informatica Product 360 (iP360) solutions across various business units
- Configure and deploy P360 on AWS/Azure provisioned environments
- Integrate Informatica MDM with P360 for streamlined master data management
- Create and manage catalogues
- Profile source data and determine source data and metadata characteristics
- Design and execute data quality mappings for data cleansing and de-duplication
- Develop workflows using the ActiveVOS designer
- Utilize the IDQ designer to develop mapplets and data quality rules
- Extract, transform, and load data using the Informatica Data Integration platform
- Develop data integration mapping designs
- Implement ingestion and integration processes on cloud technologies like AWS or GCP
- Build Java-based custom components for integration and automation
- Perform data transformation on XML or JSON documents
- Unit test process flows for accuracy
- Provide production support including monitoring, incident resolution, and system optimization

**Qualifications:**

**Technical Experience:**
- 6+ years of experience as an Informatica Developer with MDM-PIM configuration experience
- Proficient in Informatica P360 and Informatica Developer components
- Knowledgeable in Informatica Data Quality standards and best practices
- Experience deploying P360 solutions on AWS and/or Azure
- Familiarity with Java/J2EE technologies for SDK work
- Strong database technology experience with SQL databases (Oracle, SQL Server, PostgreSQL, etc.)
- Experience in defining and deploying data quality programs on enterprise projects
- Data analyst experience including data profiling, mapping, validation, manipulation, and analysis
- Workflow development experience using Informatica ActiveVOS
- Ability to translate business problems into data quality initiatives
- Familiarity with data quality technology trends

As an Informatica MDM-PIM Consultant at Infoya, you will have the opportunity to work in a dynamic and collaborative environment, with competitive salary and benefits, professional development opportunities, and the chance to collaborate with a diverse and talented team.
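To illustrate the JSON document transformation named in the responsibilities, a minimal Python sketch follows; the field names are invented and not tied to any P360 schema:

```python
# Illustrative JSON-to-JSON product transformation of the sort described
# above; the field names are invented, not from any P360 schema.
import json

raw = '{"sku": "AB-123", "name": " Widget ", "price_cents": 1999}'


def transform(doc: str) -> str:
    rec = json.loads(doc)
    out = {
        "productId": rec["sku"],
        "title": rec["name"].strip(),         # cleanse whitespace
        "price": rec["price_cents"] / 100.0,  # normalize units
    }
    return json.dumps(out)


print(transform(raw))  # {"productId": "AB-123", "title": "Widget", "price": 19.99}
```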
posted 2 months ago

Data Engineer ETL Informatica

AES Technologies India Pvt Limited
Experience: 6 to 10 Yrs
location
Coimbatore, Tamil Nadu
skills
  • AWS
  • Informatica
  • ETL
  • Oracle SQL
  • Python
  • Tableau
  • Databricks
Job Description
Role Overview:
As an Industry Consulting Consultant, your main role will be to manage the end-to-end migration process from Informatica PowerCenter (CDI-PC) to Informatica IDMC, ensuring minimal disruption to business operations. You will also be responsible for integrating data from various sources into AWS and Databricks environments, developing ETL processes, monitoring data processing performance, implementing security best practices, and collaborating with cross-functional teams to deliver appropriate solutions.

Key Responsibilities:
- Manage the migration process from Informatica PowerCenter to Informatica IDMC
- Create mappings and workflows, and set up Secure Agents in Informatica IDMC
- Integrate data from internal and external sources into AWS and Databricks environments
- Develop ETL processes to cleanse, transform, and enrich data using Databricks Spark capabilities
- Monitor and optimize data processing and query performance in AWS and Databricks environments
- Implement security best practices and data encryption methods
- Maintain documentation of data infrastructure and configurations
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Identify and resolve data-related issues and provide support
- Optimize AWS, Databricks, and Informatica resource usage

Qualifications Required:
- Degree in Computer Science, Information Technology, Computer Engineering, or equivalent
- Minimum 6 years of experience in data engineering
- Expertise in AWS or Azure services, Databricks, and/or Informatica IDMC
- Hands-on experience in project lifecycle and implementation
- Strong understanding of data integration concepts, ETL processes, and data quality management
- Experience with Informatica PowerCenter, IDMC, Informatica Cloud Data Integration, and Informatica Data Engineering Integration
- Proficiency in BI software such as Tableau and Oracle Analytics Server
- Strong knowledge of Oracle SQL and NoSQL databases
- Working knowledge of BI standard languages like Python, C#, Java, and VBA
- AWS Associate/AWS Professional/AWS Specialty certification preferred

Additional Company Details:
Advance Ecom Solutions is dedicated to bringing together professionals and employers on the TALENTMATE Portal. Whether you are looking for your next job opportunity or potential employers, they aim to provide assistance in the hiring process. Visit their website at https://www.advanceecomsolutions.com/ for more information.
posted 2 months ago

ETL Testing GenRocket

Briskwin IT Solutions
Experience: 6 to 12 Yrs
location
Chennai, Tamil Nadu
skills
  • TDM
  • ETL
  • SQL
  • Groovy
  • Core Java
  • Python
  • GenRocket
Job Description
As an ETL Testing professional with 6 to 12 years of experience, you will be responsible for developing self-service data provisioning utilities using Java, Groovy, or Python. Your role will involve performing Test Data Management (TDM) functions such as data setup, developing methods for data provisioning, building frameworks for data mining and data reservation, and architecting TDM solutions. You will be required to provide data provisioning for domain and enterprise-wide applications, including data discovery, data masking, and data subsetting, as well as architecting gold-copy databases and data stores for quick data provisioning.

Key Responsibilities:
- Develop test data design recommendations and solutions
- Execute against test data management plans and test data governance
- Develop test data management solutions, plans, and strategies for the test data lifecycle
- Manage test data discovery, analysis, generation, masking, sub-setting, validation, defect resolution, refresh, archival, and purge
- Implement data masking and synthetic data generation using TDM tools like Informatica Test Data Management, GenRocket, or similar enterprise tools
- Service test data requests based on assigned priorities
- Write queries to validate data accuracy using SQL and API services (Elasticsearch)
- Apply strong working experience in Core Java, Groovy, or Python scripting
- Be well-versed with all common RDBMS and cloud data sources
- Implement and support technical capabilities, repeatable processes, and best practices to support test data management

Qualifications Required:
- 6 to 12 years of experience in ETL testing
- Key skills in TDM, ETL, SQL, GenRocket, Groovy, and Core Java or Python
- Demonstrated experience implementing data masking and synthetic data generation using TDM tools
- Strong knowledge of Core Java, Groovy, or Python scripting
- Familiarity with common RDBMS and cloud data sources

Please note that additional details about the company are not mentioned in the provided job description.
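As a conceptual illustration of the data masking and synthetic data generation described above (not GenRocket's or Informatica TDM's API), consider this small Python sketch; the field names and value ranges are invented:

```python
# Sketch of deterministic data masking plus simple synthetic-data
# generation, the two TDM activities named above; concepts only, not
# GenRocket's or Informatica TDM's API.
import hashlib
import random


def mask_email(email: str) -> str:
    """Deterministic mask: the same input always maps to the same token."""
    digest = hashlib.sha256(email.lower().encode()).hexdigest()[:12]
    return f"user_{digest}@masked.example"


def synthetic_rows(n: int, seed: int = 42) -> list:
    """Generate simple, reproducible synthetic test rows."""
    rng = random.Random(seed)
    return [
        {
            "id": i,
            "email": mask_email(f"person{i}@corp.example"),
            "balance": round(rng.uniform(0, 10_000), 2),
        }
        for i in range(1, n + 1)
    ]


print(synthetic_rows(2))
```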