
5 Architecture Modeling Jobs near Madurai

posted 1 month ago

BIM Coordinator

Innolink Digital Services
Experience: 7 to 11 Yrs
Location: Madurai, Tamil Nadu
Skills:
  • Revit
  • Navisworks
  • AutoCAD
  • BIM Coordination
  • Landscape Modeling
  • Clash Detection
Job Description
As a Senior BIM Coordinator at our Madurai location, your role involves overseeing BIM processes to ensure seamless project execution. With 7 to 10 years of experience, you are expected to possess expertise in BIM Coordination, Landscape Modeling, and Revit.

Key Responsibilities:
- Oversee BIM Coordination processes to ensure accuracy and completeness of project models and documentation.
- Collaborate with landscape architects and design teams to integrate landscape elements within BIM models.
- Utilize Revit for model creation, documentation, and coordination, ensuring adherence to project standards.
- Conduct Clash Detection using Navisworks to identify and resolve conflicts for seamless construction workflows.
- Facilitate project coordination meetings, communicating effectively with stakeholders to keep the project aligned.
- Maintain and update BIM models throughout the project lifecycle, adapting to changes and client requirements.
- Provide technical guidance and support to project teams, enhancing overall BIM proficiency and project delivery.
- Ensure project delivery within specified timelines and budget while upholding the highest quality standards.

Qualifications and Skills:
- Proven experience of 7-10 years in BIM Coordination, ensuring seamless project execution from inception to completion.
- Strong expertise in Landscape Modeling, integrating landscape architecture within BIM models (mandatory skill).
- Proficiency in Revit for modeling, collaboration, and delivering comprehensive BIM projects (mandatory skill).
- Skilled in using Navisworks for model review, ensuring smooth integration and resolution of project components.
- Experience with Clash Detection processes to identify and resolve design conflicts early in the design phase.
- Proficiency in AutoCAD for drafting and detailing, complementing BIM processes for comprehensive designs.
- Ability to coordinate projects efficiently, managing timelines, resources, and expectations for successful delivery.
- Excellent communication and collaboration skills to liaise with multidisciplinary teams and stakeholders.
ACTIVELY HIRING

posted 2 months ago

Junior Architect

Edge Architecture & Construction
Experience: 12 to 16 Yrs
Location: Madurai, Tamil Nadu
Skills:
  • AutoCAD
  • SketchUp
Job Description
As a Junior Architect at our company, you will be responsible for assisting in design development, 3D modeling, creating working drawings, and collaborating with consultants and site teams. Your expertise in AutoCAD and SketchUp will be crucial for the successful execution of projects.

Key Responsibilities:
- Assist in design development
- Create 3D models using SketchUp
- Generate accurate working drawings
- Coordinate effectively with consultants and site teams

Qualifications Required:
- Bachelor's degree in Architecture
- Minimum of 12 years' experience in the field
- Proficiency in AutoCAD and SketchUp; knowledge of rendering tools is a plus
- Strong design sense and excellent teamwork skills

Join our team in Madurai on a full-time basis and contribute your architectural expertise to our projects.
ACTIVELY HIRING
posted 2 months ago

AWS Data Engineer

Techmango Technology Services
Experience: 4 to 8 Yrs
Location: Madurai, Tamil Nadu
Skills:
  • Python
  • indexing
  • T-SQL
  • AWS services
  • SQL performance tuning
  • schema design
  • scripting automation
Job Description
Role Overview:
As an AWS Data Engineer at TechMango Technology Services, you will be responsible for designing, architecting, and maintaining high-performance, scalable data pipelines and cloud data warehouses using AWS services like Redshift, Glue, S3, Lambda, and Step Functions. Your role will involve solving complex data challenges, optimizing distributed systems, and collaborating with a high-performance engineering team.

Key Responsibilities:
- Architect, build, and maintain robust and scalable data pipelines using AWS services such as Glue, Lambda, Step Functions, S3, Redshift, and Athena.
- Design and optimize schemas for Redshift and Snowflake to support analytics, reporting, and data science requirements.
- Implement efficient and reliable ETL/ELT processes for handling large volumes of structured and unstructured data.
- Enforce and monitor data SLAs to ensure data freshness, reliability, and availability across environments.
- Collaborate with engineering, product, and analytics teams to translate business requirements into robust data models and pipelines.
- Identify and resolve bottlenecks, data quality issues, and system inefficiencies proactively.
- Implement schema versioning, data lineage tracking, and database change management practices.
- Define and enforce best practices for data governance, access control, observability, and compliance.
- Contribute to CI/CD workflows and infrastructure-as-code practices using tools like CloudFormation or Terraform.

Qualifications Required:
- 4+ years of experience in data engineering or backend systems development with a focus on cloud-based architectures.
- Proficiency in the AWS data ecosystem, including Redshift, Glue, S3, Athena, Lambda, Step Functions, and CloudWatch.
- Strong background in SQL performance tuning, schema design, indexing, and partitioning strategies for large datasets.
- Hands-on experience with Python, T-SQL, and scripting automation for data ingestion and transformation.
- Understanding of relational and dimensional data modeling, normalization, and schema evolution.
- Experience with source control systems like Git or Bitbucket and CI/CD pipelines for data infrastructure.
- Track record of translating complex business requirements into scalable data solutions.
- Knowledge of data governance, security, and compliance frameworks is a plus.
- Familiarity with monitoring and observability tools like CloudWatch, Datadog, or Prometheus.
- Bonus: exposure to Snowflake or MSSQL in hybrid cloud environments.

Additional Details about the Company:
TechMango Technology Services, founded in 2014, is a leading software development company focusing on emerging technologies. They aim to deliver strategic solutions aligned with their business partners' technological needs and are recognized as the Best Offshore Software Development Company in India. Operating in the USA, UAE, and India, TechMango strives to offer high-quality and cost-efficient services while fostering long-term client relationships.
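For illustration of the Glue-centred pipeline work this listing describes, here is a minimal sketch of an AWS Glue PySpark job that reads a cataloged source, remaps columns, and loads Redshift. The database, table, connection, and bucket names are hypothetical placeholders, not taken from the posting.

```python
# Minimal AWS Glue job sketch (PySpark): catalog source -> column mapping -> Redshift.
# All identifiers (sales_raw, orders, redshift-dw, the S3 temp dir) are hypothetical.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw orders from the Glue Data Catalog (hypothetical database/table).
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_raw", table_name="orders"
)

# Rename/cast columns to match the warehouse schema.
mapped = ApplyMapping.apply(
    frame=orders,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "double", "order_amount", "double"),
        ("order_ts", "string", "order_ts", "timestamp"),
    ],
)

# Load into Redshift via a preconfigured Glue connection (hypothetical names).
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=mapped,
    catalog_connection="redshift-dw",
    connection_options={"dbtable": "analytics.orders", "database": "dw"},
    redshift_tmp_dir="s3://example-bucket/tmp/",
)
job.commit()
```

In practice such a script runs inside the Glue job runtime (which provides the awsglue library) and is typically triggered from Step Functions or a schedule.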
ACTIVELY HIRING

posted 2 months ago

GCP Data Engineer

Techmango Technology Services Private Limited
Experience: 3 to 7 Yrs
Location: Madurai, Tamil Nadu
Skills:
  • SQL
  • NoSQL
  • Google Pub/Sub
  • Dataflow (Apache Beam)
  • BigQuery
  • Cloud Composer (Airflow)
  • Cloud Storage (GCS)
  • Terraform
  • CI/CD
Job Description
As a GCP Data Engineer/Lead/Architect at TechMango, you will play a crucial role in designing, building, and optimizing data pipelines in the Google Cloud Platform environment. Your expertise in real-time streaming data architectures will be essential in creating scalable and low-latency streaming data pipelines using tools like Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities:
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Collaborate with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps for CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience:
- 5+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery, Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.

TechMango offers a supportive work environment with benefits including a Badminton Ground, Free Accommodation, Cab Facility for Female Employees, Insurance, GYM, Subsidized Food, Awards and Recognition, and Medical Checkup. Join us as a GCP Data Engineer to drive data excellence and innovation using cutting-edge technology in a dynamic team environment. Apply now for this exciting career opportunity!
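As an illustration of the Pub/Sub to BigQuery streaming pattern this listing centres on, here is a minimal Apache Beam pipeline sketch. The project, subscription, and table identifiers are hypothetical, and a real Dataflow deployment would add runner and temp-location options.

```python
# Minimal Apache Beam streaming sketch: Pub/Sub -> parse JSON -> BigQuery.
# Subscription and table names are hypothetical placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    event = json.loads(message.decode("utf-8"))
    return {"event_id": event["id"], "event_type": event["type"], "ts": event["ts"]}


options = PipelineOptions(streaming=True)  # plus --runner=DataflowRunner etc. in practice

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/example-project/subscriptions/events-sub"
        )
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```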
ACTIVELY HIRING
posted 2 months ago
Experience: 6 to 10 Yrs
Location: Madurai, Tamil Nadu
Skills:
  • .NET
  • C#
  • MVC
  • Entity Framework
  • MS SQL
  • ASP.NET
  • Web API
  • Azure DB
  • CI/CD
  • Azure Cloud
Job Description
As a .NET Backend Engineer (Vibe Coding First) for the UK's leading wholesaler, you will play a crucial role in leading the development of new product initiatives and ambitious integration projects. Your responsibilities will include partnering with a Business Analyst and a UX Developer on product requirements, creating technical deliverables, designing and building complex backend systems, and ensuring timely and measurable execution. Your proactive attitude, coupled with a disciplined approach to problem-solving, will be key in adapting to a fast-paced environment. You should have a strong bias towards elegant and simple solutions that directly add value to users, be excited by "zero to one" projects, and possess the ability to efficiently communicate findings to leadership. Striving for a balance between fast delivery and scalability, you will follow backend development best practices, leverage AI-first workflows (vibe coding), and prioritize documentation.

Responsibilities:
- Design, develop, and maintain high-quality, scalable, service-oriented applications using .NET technologies.
- Build and maintain REST APIs and public-facing APIs, ensuring security and scalability.
- Collaborate closely with frontend developers to deliver seamless end-to-end solutions.
- Translate requirements into robust backend systems in collaboration with UX and Business Analysts.
- Write unit and integration tests to ensure code quality and maintainability.
- Deploy backend applications using CI/CD pipelines and Azure Cloud services.
- Utilize AI-first development workflows (vibe coding) to accelerate delivery, enhance quality, and reduce repetitive tasks.
- Document technical designs, system architecture, and best practices.

Requirements:
- 5+ years of experience in building and maintaining scalable backend systems.
- Expertise in ASP.NET (C#) and related technologies such as MVC, Web API, and Entity Framework.
- Strong understanding of software architecture patterns (n-tier, microservices, SOA) and the ability to design scalable, maintainable systems.
- Proficiency in data modeling, MS SQL, and Azure DB.
- Experience with CI/CD processes and Microsoft Azure Cloud.
- Hands-on experience with hybrid applications with offline support.
- Strong sense of ownership, focusing on long-term usability and extensibility.
- Proven problem-solving skills with a logical, methodical, and proactive approach.
- Minimum 6-8 months of professional experience using vibe coding as the default backend development workflow.
- Collaborative team player with the flexibility to learn new tools and skillsets.

Desired Skills / Experience:
- Wholesale or e-commerce industry experience.
- Familiarity with other backend frameworks or cross-platform solutions.
ACTIVELY HIRING
posted 2 months ago

Junior Engineer Civil

V-hire Consultancy
Experience: 0 to 4 Yrs
Salary: 1.5 - 4.0 LPA
Location: Chennai
Skills:
  • 2d drafting
  • interior fit-out
  • design engineering
  • 3d modeling
  • civil supervision
  • autocad
  • interior designing
  • interior architecture
  • interior works
  • revit
Job Description
A Junior Engineer Civil is responsible for assisting in the planning, design, execution, and supervision of civil engineering projects, usually under the guidance of senior engineers. The role is ideal for candidates with a Diploma or Bachelor's Degree in Civil Engineering.

Key Responsibilities:
- Assist in planning, designing, and executing civil engineering projects such as roads, buildings, drainage, and sewage systems.
- Prepare and review project plans, drawings, technical specifications, and documentation.
- Conduct site visits for inspections, measurements, and compliance assessments against safety standards and construction codes.
- Support senior engineers in construction supervision and project management, helping to coordinate teams and contractors.
- Prepare progress reports, track project schedules and budgets, and highlight issues or delays for senior review.
- Participate in obtaining permits, submitting applications, and ensuring regulatory compliance.
- Develop and evaluate cost estimates for materials, equipment, and labor to determine project feasibility.
- Use engineering software (AutoCAD, Civil 3D, MS Office Suite) to create designs and technical drawings.

Essential Qualifications:
- Diploma or Bachelor's Degree in Civil Engineering from a recognized institute or university.
- 2-4 years of experience is typical for entry-level positions; internships or site exposure preferred.
- Strong understanding of engineering principles, mathematics, and construction practices.

Skills Required:
- Proficiency in AutoCAD, Civil 3D, and MS Office Suite.
- Good communication skills and teamwork abilities.
- Technical writing, problem-solving, and time management.

Daily Activities:
- Drafting and revising project plans using software.
- Site inspections and data collection.
- Reviewing and reporting project progress.
- Assisting with project permits and documentation.

This position helps bridge the gap between design and on-site execution, making it a key role on civil engineering teams.

Thanks & Regards,
Venkatesh
INTERVIEW ASSURED IN 15 MINS
posted 1 week ago
Experience: 3 to 8 Yrs
Salary: 6 - 14 LPA
Location: Chennai
Skills:
  • 2d
  • 3d modeling
  • catia v5
Job Description
Job Title: Engineer Sheet Metal & BIW Design - ITC/E/20251104/19013
Location: Chennai
Experience: 3+ Years
Qualification: B.E
CTC Range: 11,00,000 - 17,00,000
Job Type: Full-time
Status: Open
Posted On: 04-Nov-2025

Job Description:
We are looking for an experienced Engineer Sheet Metal & BIW Design who specializes in sheet metal design for automotive components. The ideal candidate will have hands-on expertise in CATIA V5, with strong capabilities in 3D modeling and 2D drafting.

Key Responsibilities:
- Design and develop sheet metal components for automotive domains including upper body, closures, underbody, cargo, and body fittings.
- Create accurate 3D models and 2D drawings using CATIA V5.
- Develop design concepts, perform feasibility checks, and create parametric models.
- Finalize subsystem designs, ensuring alignment with styling, manufacturing, and performance requirements.
- Understand and apply knowledge of BIW material types, grades, and manufacturing processes.
- Ensure compliance with structural performance benchmarks and homologation requirements.
- Collaborate with cross-functional teams including manufacturing, styling, and validation teams.

Required Skills:
- Strong proficiency in CATIA V5.
- Expertise in sheet metal design, 3D modeling, and 2D drawings.
- Solid understanding of BIW design, metal grades, and manufacturing standards.
- Ability to interpret engineering specifications and deliver high-quality design outputs.
- Good analytical and problem-solving skills.

Nice-to-Have Skills:
- Understanding of vehicle architecture and subsystem integration.
- Experience in automotive OEM/Tier 1 environments.

About the Role:
This position offers the opportunity to work on advanced automotive designs and contribute to the development of high-quality BIW components. You will work closely with design, testing, and manufacturing teams to ensure robust product delivery.
INTERVIEW ASSURED IN 15 MINS
posted 1 week ago
Experience: 8 to 13 Yrs
Location: Chennai
Skills:
  • design
  • technical
  • cad
  • catia
  • combustion
  • engine
  • modeling
  • manufacturing
  • calculations
  • ugnx
  • feasibility
  • engines
  • internal
Job Description
Lead Engineer AD Gasoline Engines Design (Chennai)

Role: Lead the design and development of internal combustion gasoline engines and their subsystems, ensuring engine architecture optimization, manufacturing feasibility, and timely delivery.

Key Responsibilities:
- Develop engine designs using CAD tools like Unigraphics and CATIA, generating multiple design alternatives.
- Perform technical calculations to validate engine performance and subsystems.
- Finalize engine subsystem designs focusing on manufacturability and cost-effectiveness.
- Collaborate with cross-functional teams including manufacturing, testing, and validation.
- Ensure design adherence to quality, cost, and timeline requirements, and contribute to customer satisfaction.

Requirements:
- B.Tech in Mechanical or Automotive Engineering.
- Expertise in engine design, internal combustion engines, technical calculations, and CAD modeling.

Location: Chennai

This JD highlights leadership in gasoline engine design, emphasizing strong technical skills, cross-team collaboration, and delivering feasible engine solutions aligned with project and customer goals.
INTERVIEW ASSURED IN 15 MINS
posted 2 days ago

Data Engineer

RCG Global Services
Experience: 5 to 9 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • SQL
  • Data Modeling
  • Snowflake
  • Azure
  • Healthcare
  • Life Sciences
  • Data Analytics
  • CI/CD
  • Azure ADF
  • ETL/ELT
  • ER/Studio
  • GxP Processes
  • Design Reviews
  • Code Reviews
  • Deployment Reviews
  • Medallion Architecture
  • Digital Engineering
  • Cloud Innovation
Job Description
Role Overview:
As a Data Warehouse Engineer at Myridius, you will need solid SQL skills and a basic knowledge of data modeling. The role involves working with Snowflake in Azure and a CI/CD process using any tooling. Familiarity with Azure ADF and ETL/ELT frameworks would be beneficial for this position.

Key Responsibilities:
- Work with Snowflake in Azure and use a CI/CD process with any tooling.
- Apply solid SQL skills and a basic knowledge of data modeling.
- Use Azure ADF and ETL/ELT frameworks where required.
- Optimize Snowflake SQL queries to enhance performance.
- Oversee engineers (for a Senior Data Warehouse Engineer position) while actively engaging in the same tasks.
- Conduct design reviews, code reviews, and deployment reviews with engineers.
- Demonstrate expertise in solid data modeling, preferably using ER/Studio or an equivalent tool.
- Maintain a good understanding of the healthcare/life sciences industry and knowledge of GxP processes.

Qualifications Required:
- Experience with ER/Studio and a good understanding of the healthcare/life sciences industry.
- Familiarity with medallion architecture.
- Knowledge of GxP processes is a plus.
- Passion for driving significant growth and maintaining a competitive edge in the global market.

Note: Myridius is dedicated to transforming the way businesses operate by offering tailored solutions in AI, data analytics, digital engineering, and cloud innovation. With over 50 years of expertise, Myridius drives a new vision to propel organizations through rapidly evolving technology and business landscapes. The commitment to exceeding expectations ensures measurable impact and fosters sustainable innovation. Myridius co-creates solutions with clients that anticipate future trends and help businesses thrive in a world of continuous change. Join Myridius in crafting transformative outcomes and elevating businesses to new heights of innovation. Visit www.myridius.com to learn more about how Myridius leads the change.
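By way of illustration of the Snowflake work this listing describes, here is a small sketch that uses the Snowflake Python connector to promote rows from a staging table into a curated table with a MERGE, in the spirit of a medallion-style flow. Account, warehouse, database, and table names are hypothetical.

```python
# Minimal Snowflake sketch: promote rows from a staging (bronze) table to a
# curated (silver) table with a MERGE. All identifiers are hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO CURATED.PATIENT_VISITS AS tgt
USING STAGING.PATIENT_VISITS_RAW AS src
    ON tgt.VISIT_ID = src.VISIT_ID
WHEN MATCHED THEN UPDATE SET
    tgt.VISIT_DATE = src.VISIT_DATE,
    tgt.SITE_CODE  = src.SITE_CODE
WHEN NOT MATCHED THEN INSERT (VISIT_ID, VISIT_DATE, SITE_CODE)
    VALUES (src.VISIT_ID, src.VISIT_DATE, src.SITE_CODE);
"""

conn = snowflake.connector.connect(
    account="example_account",   # hypothetical account locator
    user="etl_service",
    password="change-me",        # use a secret store in practice
    warehouse="TRANSFORM_WH",
    database="HEALTHCARE_DW",
)
try:
    cur = conn.cursor()
    cur.execute(MERGE_SQL)
    print(f"Rows affected: {cur.rowcount}")
finally:
    conn.close()
```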
ACTIVELY HIRING
posted 3 weeks ago

Senior Database Architect

Strategic Ventures-in
Experience: 7 to 11 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • database architecture
  • performance tuning
  • data modeling
  • data security
  • schema design
  • multi-tenant SaaS design
  • JVM-based app integration
  • cloud databases
  • CI/CD pipelines
Job Description
Role Overview:
You will be responsible for designing and architecting the database and schema for a scalable, secure, and cloud-native Health Information System. Your main focus will be turning business and clinical workflows into a robust data foundation for the SaaS platform. Collaboration with full-stack developers and infrastructure specialists will be crucial to ensure seamless integration and deployment on the cloud.

Key Responsibilities:
- Design and architect the database and schema for the Health Information System.
- Collaborate with full-stack developers and infrastructure specialists on integration and deployment on the cloud.

Qualifications Required:
- At least 7 years of experience in database architecture, schema design, and performance tuning, with expertise in PostgreSQL, MySQL, or Oracle.
- Deep understanding of data modeling, multi-tenant SaaS design, and JVM-based app integration using Spring Boot and Hibernate.
- Experience with cloud databases such as AWS or Azure, CI/CD pipelines, and data security standards like HIPAA/GDPR.
- Strong knowledge of healthcare or rehabilitation data workflows is a plus.

If you are passionate about transforming healthcare through intelligent data architecture, we would love to meet you! Apply or connect with us to learn more.
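To illustrate the multi-tenant SaaS schema design this role calls for, here is a minimal sketch that creates a tenant-scoped table in PostgreSQL and isolates rows with row-level security. The DSN, table, and session-setting names are hypothetical placeholders.

```python
# Minimal multi-tenant schema sketch for PostgreSQL: every row carries a
# tenant_id, and row-level security filters queries by the current tenant.
# Connection details, table names, and the session setting are hypothetical.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS patient_records (
    id         BIGSERIAL PRIMARY KEY,
    tenant_id  UUID NOT NULL,
    mrn        TEXT NOT NULL,
    created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);

ALTER TABLE patient_records ENABLE ROW LEVEL SECURITY;

-- Each session sets app.tenant_id; queries then only see that tenant's rows.
CREATE POLICY tenant_isolation ON patient_records
    USING (tenant_id = current_setting('app.tenant_id')::uuid);
"""

conn = psycopg2.connect("dbname=his user=dba")  # hypothetical DSN
with conn, conn.cursor() as cur:
    cur.execute(DDL)
```

Row-level security is one common isolation choice for shared-schema multi-tenancy; schema-per-tenant or database-per-tenant are alternatives with different trade-offs.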
ACTIVELY HIRING
posted 3 weeks ago
Experience: 3 to 15 Yrs
Location: Chennai, All India
Skills:
  • requirements gathering
  • Enterprise Security
  • Architecture
  • Enterprise and/or IT Service Management
  • ServiceNow advanced system administration
  • ServiceNow application development
  • database design schemas
  • data modeling
Job Description
As a Platform Architect, your role involves documenting the overall platform design and analyzing the impact of new requirements.

Key Responsibilities:
- Define, guide, and support the execution of technical governance processes.
- Help develop standards and practices for maintaining the ServiceNow architecture model.
- Support design and implementation of a platform operating model to achieve desired outcomes and foster end-user adoption.
- Provide technical evaluation of demands against ServiceNow platform architecture, platform capabilities, and best practices.
- Offer guidance on prototyping and accelerating time from design to deployment.
- Advise on configuration and coding standards.
- Support remediation of configurations not aligned to ServiceNow best practices.

Qualifications Required:
- Experience with at least 3 modules and a minimum of 3 product implementations.
- Certifications: CSA, CAD, and 2 CIS.

Ideal Experience:
- Enterprise Security and Architecture.
- Enterprise and/or IT Service Management.
- ServiceNow advanced system administration.
- ServiceNow application development.
- Experience with database design schemas and data modeling.
- Strong requirements-gathering experience.

Please note that the role demands a minimum of 15+ years of experience. The location for this position is PAN India, and the shift timings are General Shift.
ACTIVELY HIRING
posted 1 day ago
Experience: 3 to 7 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • SQL
  • Data modeling
  • Spark
  • Azure Data Factory
  • Azure Synapse Analytics
  • Azure Data Lake Storage
  • RDBMS systems
  • ELT batch design
  • Data integration techniques
  • Python programming
  • Serverless architecture using Azure Functions
  • Streaming services
  • Azure AI services
  • PowerBI
  • Data Lake house architecture
Job Description
As an Executive - SAP Support at House of Shipping, you will be responsible for designing, implementing, and managing data solutions using Azure Data services to support data-driven decision-making processes. Your main focus will be creating robust data pipelines and architectures for efficient Extraction, Transformation, and Loading (ETL) operations from various sources into the data warehouse or data lakehouse.

Key Responsibilities:
- Design and develop data solutions on Azure.
- Manage data integration workflows using Azure Data Factory.
- Implement data storage solutions using Azure Synapse SQL Pool and other Azure services.
- Monitor and optimize the performance of data pipelines and storage solutions.
- Collaborate with data analysts, developers, and business stakeholders to understand data requirements.
- Troubleshoot data-related issues to ensure data accuracy, reliability, and availability.

To succeed in this role, you should have proficiency in:
- Azure Data Factory
- Azure Synapse Analytics
- Azure Data Lake Storage
- SQL and RDBMS systems
- Data modeling
- ELT batch design
- Data integration techniques
- Python programming
- Serverless architecture using Azure Functions

Experience with Spark, streaming services, Azure AI services, Power BI, and data lakehouse architecture will be beneficial. Preferred certifications for this role include Microsoft Certified: Azure Data Engineer Associate or Microsoft Certified: Fabric Analytics Engineer Associate.

If you are passionate about data solutions and thrive in a collaborative environment delivering effective solutions, we welcome you to join our team at House of Shipping.
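As a sketch of the ELT/lakehouse pattern referenced above (raw files landed in Azure Data Lake Storage, then transformed into curated tables), here is a minimal PySpark bronze-to-silver step. The storage account, container, and column names are hypothetical.

```python
# Minimal PySpark sketch of a bronze -> silver lakehouse step: read raw JSON
# landed in the data lake, clean and conform it, write partitioned Parquet.
# The abfss:// paths are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver_shipments").getOrCreate()

bronze_path = "abfss://bronze@exampledatalake.dfs.core.windows.net/shipments/"
silver_path = "abfss://silver@exampledatalake.dfs.core.windows.net/shipments/"

raw = spark.read.json(bronze_path)

clean = (
    raw.dropDuplicates(["shipment_id"])
       .withColumn("event_date", F.to_date("event_ts"))
       .filter(F.col("shipment_id").isNotNull())
)

(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet(silver_path))
```

In an Azure Data Factory or Synapse pipeline, a step like this would typically run as a notebook or Spark job activity on a schedule or trigger.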
ACTIVELY HIRING
posted 2 weeks ago
Experience: 3 to 12 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • Data Architecture
  • SQL
  • T-SQL
  • Data Modeling
Job Description
As a Data Architect, your role involves designing scalable, high-performance, and compliant data architectures. You will identify opportunities to optimize cost, time, and asset utilization in complex data projects, conduct technical audits and data architecture reviews to enhance efficiency and quality, and define and track architectural quality metrics for enterprise data solutions.

Key Responsibilities:
- Design and review scalable, high-performance, and compliant data architectures.
- Identify opportunities for cost, time, and asset utilization optimization in complex data projects.
- Conduct technical audits and data architecture reviews.
- Define and track architectural quality metrics for enterprise data solutions.
- Develop and maintain logical and physical data models across diverse business use cases.
- Collaborate with stakeholders to analyze gaps, trade-offs, and dependencies.
- Define systems and sub-systems aligned with enterprise architecture goals and standards.
- Lead and manage a team of 15 data engineers, ensuring high-quality delivery.
- Provide technical mentorship, performance feedback, and career guidance to team members.
- Drive hiring, training, and skill development initiatives.
- Work closely with business owners, customers, and cross-functional teams to gather and define requirements, including non-functional requirements (NFRs).
- Collaborate with architects to assess technology landscapes, tools, and processes for the best-fit architecture strategy.
- Partner with alliance partners and vendors to drive business outcomes.
- Define technology roadmaps, analyze cost-benefit trade-offs of different architecture solutions, and support project managers in identifying technical risks and mitigation strategies.

Qualifications Required:
- 12+ years of IT experience as a Data Engineer / Database Programmer, with 3+ years as a Data Architect.
- 3+ years of experience in data modeling and database design.
- Expert-level proficiency in SQL and T-SQL.
- Proven ability to manage multiple parallel projects and lead large technical teams.
- Deep understanding of data governance, ETL processes, and data integration frameworks.
- Experience defining and implementing non-functional requirements (NFRs) and system architecture.
- Excellent analytical, problem-solving, and communication skills.

Preferred Skills:
- Cloud platform experience (AWS, Azure, or GCP).
- Exposure to data security, compliance, and governance frameworks in healthcare.
- Familiarity with DevOps practices and CI/CD data pipelines.
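For illustration of the logical-to-physical data modeling this role references, here is a small star-schema sketch expressed as T-SQL DDL and executed through pyodbc. The server, database, tables, and columns are hypothetical, and the ODBC connection string would need to match the actual environment.

```python
# Minimal star-schema sketch (T-SQL): one fact table with foreign keys to two
# dimensions. Server/database names and the schema itself are hypothetical.
import pyodbc

DDL_STATEMENTS = [
    """
    CREATE TABLE dim_patient (
        patient_key INT IDENTITY(1,1) PRIMARY KEY,
        patient_id  NVARCHAR(32) NOT NULL,
        birth_year  INT NULL
    );
    """,
    """
    CREATE TABLE dim_date (
        date_key  INT PRIMARY KEY,        -- e.g. 20240131
        full_date DATE NOT NULL
    );
    """,
    """
    CREATE TABLE fact_encounter (
        encounter_key BIGINT IDENTITY(1,1) PRIMARY KEY,
        patient_key   INT NOT NULL REFERENCES dim_patient(patient_key),
        date_key      INT NOT NULL REFERENCES dim_date(date_key),
        charge_amount DECIMAL(12, 2) NOT NULL
    );
    """,
]

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=example-sql;DATABASE=edw;"
    "Trusted_Connection=yes;"
)  # hypothetical connection string
cursor = conn.cursor()
for ddl in DDL_STATEMENTS:
    cursor.execute(ddl)
conn.commit()
```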
ACTIVELY HIRING
posted 2 weeks ago
Experience: 8 to 12 Yrs
Location: Chennai, All India
Skills:
  • .NET
  • Python
  • Java
  • AWS
  • Azure
  • microservices
  • containerization
  • orchestration
  • Docker
  • Kubernetes
  • cloud security
  • compliance
  • cost optimization
  • threat modeling
  • Node.js
  • API gateways
  • messaging systems
  • GenAI apps
  • ML workflows
  • REST APIs
  • event-driven architectures
  • secure SDLC practices
  • security assessments
Job Description
As a visionary Solution Architect with over 8 years of experience, you will lead enterprise-grade digital transformation projects focusing on cloud-native solutions, application modernization, GenAI integration, and team mentoring.

**Key Responsibilities:**
- Lead solution architecture and design for key projects, providing accurate estimates and collaborating with architects from various disciplines.
- Collaborate with delivery teams, QA, and stakeholders to transform requirements into scalable, secure, and cloud-native solutions.
- Define and enforce technical best practices and security guidelines while mentoring engineers and guiding teams through technical challenges.
- Drive modernization initiatives including refactoring, containerization, microservices adoption, and database migration to cloud-native or NoSQL solutions.
- Integrate AI/ML and GenAI components into enterprise workflows and design innovative GenAI-enabled applications.
- Continuously assess and recommend tools, frameworks, and platforms to meet evolving project needs efficiently.

**Qualifications Required:**
- Bachelor's/Master's in Computer Science, Engineering, or a related field.
- Strong programming experience in .NET, Python, Java, or Node.js.
- Hands-on experience with AWS or Azure, microservices, API gateways, and messaging systems.
- Knowledge of GenAI apps, ML workflows, REST APIs, and event-driven architectures.
- Experience with database modernization, schema optimization, migration tools, and cloud-native database platforms.
- Familiarity with containerization, orchestration, and secure SDLC practices.
- Certifications like AWS/Azure Solution Architect or TOGAF are desirable.
- Awareness of cloud security, compliance, and cost optimization.
- Ability to integrate security considerations throughout the SDLC.
- Experience conducting security assessments.

If you are a forward-thinking Solution Architect with a passion for driving innovation and leading modernization initiatives, this role at OptiSol is the perfect opportunity for you. Join our team and architect next-generation enterprise solutions while collaborating with expert teams and gaining leadership visibility. Apply now and become an OptiSolite!
ACTIVELY HIRING
posted 1 month ago
Experience: 6 to 13 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • Leadership
  • Automation
  • Digital Transformation
  • Solution Architecture
  • Project Management
  • Communication
  • Presentation
  • NLP
  • Team Management
  • Solution Delivery
  • Onboarding
  • SDLC
  • Operating Models
  • AI
  • Client-Facing Engagements
  • Outcome-Based Pricing Models
  • Technical Concepts Articulation
  • AI Tools Machine Learning
  • RPA
  • Strategic Guidance
  • Mentorship
  • AI Certification
  • HCL Tech AI Force Platform
  • Client Proposals Alignment
  • GenAI Tools Deployment
  • Commercial Modeling
  • KPIs Monitoring
  • Benchmarks Reporting
Job Description
Role Overview:
As a Solution Consultant Presales (AI) at HCLTech, you will play a crucial leadership role within the Integrated GTM team, focusing on transforming Digital Business Services (DBS) and Engineering Services (ERS). Your responsibilities will involve leading strategic pursuits, leveraging cutting-edge technologies like Generative AI (Gen AI), and delivering innovative solutions tailored to client needs. You will collaborate with practice heads and technical experts to identify growth opportunities, develop robust GTM strategies, and create persuasive Gen AI-empowered proposals. Your role will be centered on enabling end-to-end service transformation, driving industry-aligned value chain innovation, and enhancing process excellence through Gen AI. Close coordination with sales SMEs and consultants will ensure that proposals are fully aligned with business objectives.

Key Responsibilities:
- Lead a team of Solution Specialists, ensuring alignment with client needs and business goals.
- Serve as the primary point of contact for escalations, resolving issues and ensuring smooth project delivery.
- Oversee the development of customized technical solutions for clients, ensuring they align with business objectives.
- Design, implement, and optimize outcome-based pricing constructs for presales proposals.
- Ensure continuous upskilling of Solution Specialists in the latest AI tools and technologies.
- Collaborate with sales, product, and technical teams to seamlessly integrate AI and other solutions into client proposals.
- Monitor the progress of ongoing pre-sales engagements and provide regular feedback to team members.
- Develop and implement strategies to enhance pre-sales activities by leveraging AI tools and technologies.
- Drive internal collaboration and knowledge sharing within the presales team.

Qualifications:
- Educational Qualifications: MBA (Master of Business Administration) with a focus on strategy, technology management, or a related field.
- Experience: Minimum 5-7 years of experience in presales consulting, solution architecture, or a related technical field, with at least 2 years in a leadership role.
- Strong hands-on experience with AI technologies such as machine learning, NLP, and robotic process automation (RPA).
- Proven expertise in designing and implementing outcome-based pricing models.
- Experience in managing and mentoring teams, with a strong focus on performance management and coaching.
- Background in client-facing roles, with the ability to manage complex engagements and propose tailored solutions.
ACTIVELY HIRING
posted 2 weeks ago
Experience: 10 to 14 Yrs
Location: Chennai, All India
Skills:
  • Snowflake
  • EMR
  • Data Modeling
  • Data Architecture
  • Java
  • Python
  • Hadoop
  • Spark
  • Effective communication skills
  • Leadership
  • Strategic thinking
  • Vendor management
  • Team management
  • Legacy/on-premise ETL solutions/platforms (Informatica, Ab Initio)
  • AWS cloud services (Glue, RDS, EC2, S3)
  • SQL database design
  • Data pipeline orchestration (e.g., Apache Airflow)
  • Big Data tools (e.g., Hadoop, Spark)
  • Decisionmaking
  • Organizational skills
Job Description
As an experienced Data Engineer at FIS, you will have the opportunity to work on challenging financial services and technology issues. You will be part of a team that values openness, collaboration, entrepreneurship, passion, and fun.

**Role Overview:**
You will be responsible for designing, developing, testing, and maintaining architectures such as databases and large-scale processing systems.

**Key Responsibilities:**
- Identifying and resolving database performance, capacity, and scalability issues
- Overseeing the design, maintenance, and use of an Extract, Transform, Load (ETL) pipeline
- Developing testable, reusable code in SQL and Python
- Providing technical leadership, project management, and mentoring for junior engineers
- Developing data set processes for data modeling, mining, and production
- Recommending ways to improve data reliability, efficiency, and quality
- Designing and developing scalable data pipelines using AWS services
- Integrating diverse data sources and ensuring data consistency and reliability
- Collaborating with data scientists and other stakeholders to understand data requirements
- Implementing data security measures and maintaining data integrity
- Monitoring and troubleshooting data pipelines to ensure optimal performance
- Optimizing and maintaining data warehouse and data lake architectures
- Creating and maintaining comprehensive documentation for data engineering processes

**Qualifications Required:**
- Bachelor of Computer Engineering
- 10+ years of relevant experience
- Experience with legacy/on-premise ETL solutions/platforms like Informatica or Ab Initio
- Experience with AWS cloud services: Glue, Snowflake, RDS, EC2, EMR, S3
- Strong experience in data modeling and data architecture
- Knowledge of programming languages such as Java and Python
- Hands-on experience with SQL database design
- Data pipeline orchestration (e.g., Apache Airflow)
- Big Data tools (e.g., Hadoop, Spark)
- Effective verbal and written communication skills
- Ability to represent FIS effectively to senior client leaders
- Decision-making ability and leadership skills
- Ability to think strategically and work with partners/vendors
- Organized approach and ability to manage and adapt priorities

Join FIS, the world's largest global provider dedicated to financial technology solutions, and be part of a team that powers billions of transactions annually. With a competitive salary, benefits, and opportunities for career development, FIS offers you a multifaceted job with high responsibility and a chance to make a difference in the financial services industry.
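To illustrate the pipeline-orchestration side of this role (Apache Airflow is named explicitly), here is a minimal DAG sketch with a daily extract-transform-load chain. The DAG id, schedule, and task bodies are hypothetical stubs rather than anything from the posting.

```python
# Minimal Apache Airflow (2.x) DAG sketch: a daily extract -> transform -> load chain.
# Task bodies are stubs; the DAG id and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    print("pull source data, e.g. from an S3 landing bucket")


def transform(**context):
    print("apply cleansing and conform to the warehouse model")


def load(**context):
    print("load curated data into the warehouse (e.g. Snowflake or Redshift)")


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```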
ACTIVELY HIRING
posted 2 weeks ago
Experience: 3 to 7 Yrs
Location: Chennai, All India
Skills:
  • JavaScript
  • Python
  • SQL
  • Go
  • REST
  • Data Mapping
  • Data Modeling
  • TypeScript
  • Data Structures
  • Algorithms
  • Large Language Models (LLMs)
  • GraphQL
  • Webhooks
  • Event-driven/pub-sub architectures
  • AWS Lambda
  • Google Cloud Functions
  • CI/CD
  • Containers
  • Observability
  • Schema Alignment
  • Graph Data Structures
  • Model Context Programming (MCP)
Job Description
As a Forward Deployed Engineer at DevRev, your role involves working closely with the pre- and post-sales Customer Experience team to develop solutions over the core platform via applications written in TypeScript, JavaScript, Python, and more. A key aspect of your responsibilities will be building the Knowledge Graph, which includes integrating DevRev with other SaaS and non-SaaS systems through webhooks, APIs, and real-time communication architectures. Your creativity and expertise will be essential in shaping how AI, analytics, and workflows are utilized in customers' processes.

- Customer-facing: Spend 30% of your time working directly with customers.
- Coding & Integration: Allocate 70-80% of your time to hands-on technical implementation.
- Build & Deploy Solutions: Design, develop, and launch AI agents, integrations, and automations connecting DevRev with customers' existing tech stacks and workflows.
- Integrate Systems: Connect DevRev with SaaS and non-SaaS platforms using APIs, webhooks, and real-time communication architectures for seamless data flow.
- Optimize AI Performance: Apply prompt engineering, fine-tune semantic search engines, and leverage generative AI techniques to enhance agent accuracy and user experience.
- Own Data Insights: Write SQL queries, conduct data analysis, and build dashboards to provide insights driving customer decision-making.
- Prototype & Iterate: Develop rapid proofs-of-concept, conduct live technical demos, and refine solutions based on customer and stakeholder feedback.
- Lead Cross-functional Collaboration: Maintain communication with customers, engineering, product, customer success, support, and revenue teams to ensure alignment.
- Guide Technical Adoption: Learn new tools and guide customers through critical workflows like code repository integrations and advanced configurations.
- Travel Ready: Be willing to travel up to 30% for on-site implementations, technical workshops, and customer engagements.

You should bring:
- Experience: 3+ years in software development, systems integration, or platform engineering. Customer-facing experience is a plus.
- Background: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Coding Skills: Strong proficiency in TypeScript/JavaScript, Python, data structures, and algorithms. (Nice to have: Go)
- Applied AI Knowledge: Familiarity with large language models (LLMs), prompt engineering, frameworks like RAG and function calling, and building evals to validate agentic AI solutions.
- Integration Expertise: Deep experience with large-scale data synchronization, API integration patterns, and event-driven/pub-sub architectures.
- Cloud & Deployment: Hands-on experience deploying on serverless and edge platforms with modern DevOps practices.
- Data Transformation: Skilled in data mapping, schema alignment, and working with heterogeneous systems. Understanding of data modeling and graph data structures.
- Observability: Experience implementing clear logging, actionable error surfacing, and telemetry to support faster debugging and issue resolution.
- Context-Aware Systems: Familiarity with Model Context Programming for building adaptive, context-aware integrations.
- Problem-Solving: Strong track record of triaging and unblocking technical and non-technical blockers.
- Documentation: Experience writing concise, well-structured documentation to support long-term team understanding and seamless onboarding.
- Communication: Strong written and verbal skills to articulate technical concepts to various stakeholders.
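As a sketch of the webhook-driven integration work described above, here is a minimal receiver that verifies a shared-secret HMAC signature before handing the event to downstream sync logic. The endpoint path, header name, secret, and downstream function are hypothetical, not part of any vendor's documented API.

```python
# Minimal webhook receiver sketch (FastAPI): verify an HMAC signature, then
# hand the event to downstream sync logic. Names and the secret are hypothetical.
import hashlib
import hmac

from fastapi import FastAPI, Header, HTTPException, Request

app = FastAPI()
SHARED_SECRET = b"replace-with-secret-from-vault"  # hypothetical placeholder


def sync_to_knowledge_graph(event: dict) -> None:
    """Placeholder for pushing the event into the target system."""
    print(f"syncing event {event.get('id')}")


@app.post("/webhooks/source-system")
async def handle_webhook(request: Request, x_signature: str = Header(default="")):
    body = await request.body()
    expected = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, x_signature):
        raise HTTPException(status_code=401, detail="bad signature")

    event = await request.json()
    sync_to_knowledge_graph(event)
    return {"status": "accepted"}
```

Signature verification plus idempotent downstream handling is a common pattern for event-driven/pub-sub integrations of this kind.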
ACTIVELY HIRING
posted 2 weeks ago
Experience: 8 to 12 Yrs
Location: Chennai, All India
Skills:
  • Data Management
  • RDBMS
  • Data modeling
  • Communication skills
  • Informatica Data Management Cloud
  • Informatica suite of products
  • Cloud Database architecture
  • Data integration patterns
  • Data Mappings
  • Leadership experience
Job Description
Role Overview:
As an Informatica Data Management Cloud (IDMC) Architect/Tech Lead at Cittabase, you will play a crucial role in leading the design, implementation, and maintenance of data management solutions using the Informatica Data Management Cloud platform. Working closely with cross-functional teams, you will architect scalable data pipelines, ensure data quality and governance, and drive successful data project deliveries. Your deep expertise in Informatica IDMC, strong leadership capabilities, and proven track record in driving data initiatives to success will be key to excelling in this role.

Key Responsibilities:
- Lead the design and implementation of data management solutions leveraging Informatica Data Management Cloud.
- Architect end-to-end data pipelines for ingestion, transformation, integration, and delivery of data across various sources and destinations.
- Collaborate with stakeholders to gather requirements, define data architecture strategies, and translate business needs into technical solutions.
- Provide technical leadership and guidance to a team of developers, ensuring adherence to coding standards, best practices, and project timelines.
- Perform performance tuning, optimization, and troubleshooting of Informatica IDMC workflows and processes.
- Stay abreast of emerging trends and technologies in data management, Informatica platform updates, and industry best practices.
- Act as a subject matter expert on Informatica Data Management Cloud, participating in solution architecture discussions, client presentations, and knowledge-sharing sessions.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8-12 years of experience in IT, specialized in Data Management (DW/Data Lake/Lakehouse).
- 6-10 years of experience in the Informatica suite of products, such as PowerCenter / Data Engineering / CDC.
- In-depth knowledge of RDBMS/cloud database architecture.
- Must have implemented a minimum of two full-lifecycle IDMC projects.
- Strong understanding of data integration patterns and data modeling concepts.
- Hands-on experience with Informatica IDMC configurations, data modeling, and data mappings.
- Proven leadership experience, with the ability to mentor team members and drive collaboration.
- Excellent communication skills, with the ability to effectively interact with technical and non-technical stakeholders.
- Able to work with a PM/BA to translate requirements into a working model and collaborate with developers to implement it.
- Able to prepare and present solution design and architecture documents.
- Knowledge of visualization/BI tools will be an added advantage.
ACTIVELY HIRING
posted 1 week ago
Experience: 5 to 9 Yrs
Location: Chennai, Tamil Nadu
Skills:
  • Enterprise Architecture
  • Solution Architecture
  • Data Architecture
  • Data Modeling
  • Data Warehousing
  • Data Integration
  • Data Governance
  • Emerging Technologies
  • IT Governance
  • Compliance
  • Technology Strategies
  • Cloud Platforms
  • Problem-Solving
  • Technology Architecture Strategies
  • Technology Roadmaps
  • AI/ML
Job Description
As an Enterprise Architecture Leader at Industrial Systems EA, your role is crucial in driving strategic technology initiatives that align with business objectives and foster innovation across the enterprise.

Key Responsibilities:
- Develop and implement enterprise-wide technology architecture strategies for Supply Chain material flow in alignment with business goals.
- Translate business goals into technology requirements.
- Create reusable reference architectures for common business capabilities to standardize technology solutions and reduce redundancy.
- Develop technology roadmaps and stay updated on emerging technologies and trends for innovation and competitive advantage.
- Build and mentor a high-performing architecture team to deliver cutting-edge solutions.
- Establish strong relationships with stakeholders across the business and IT organization to foster collaboration.
- Provide guidance on technology investments and solutions to business units.
- Drive continuous improvement in technology processes and practices for enhanced efficiency and effectiveness.
- Collaborate with Supply Chain IT and EA team members to optimize material planning and flow efficiency.
- Participate in IT governance processes to ensure technology decisions align with the organization's strategy and risk appetite.
- Ensure compliance with security and regulatory requirements in all technology initiatives.
- Evaluate new technologies and trends, including AI/ML, to determine potential impact on the organization and make recommendations.
- Support IT teams in transitioning from strategy to development and delivery phases, creating technical designs and documentation for new features and functionalities.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Excellent written and verbal communication skills, with the ability to present to senior management and write clear documentation.
- Exceptional time management and digital proficiency, utilizing tools such as MS Office.
- Ability to influence and persuade others to adopt architectural recommendations.
- Understanding of solution architecture principles and patterns, including microservices and API-first design.
- Experience in manufacturing industry-specific technology solutions is desirable.
- Deep understanding of cloud platforms (GCP, Azure, AWS) and cloud-native architectures.
- Strong problem-solving abilities.
- Proven ability to manage multiple projects concurrently.

Nice-to-have qualifications:
- Experience working with Ford plant-floor systems.
- Industry experience in automotive.
- Certification in enterprise architecture (e.g., TOGAF) is a plus.
- Data architecture knowledge, including data modeling, data warehousing, data integration, and data governance principles.
ACTIVELY HIRING
posted 1 week ago
experience10 to 14 Yrs
location
Chennai, Tamil Nadu
skills
  • Informatica PowerCenter
  • Oracle
  • DB2
  • PLSQL
  • SQL
  • Data Modeling
  • Data Warehousing
  • Performance Tuning
  • ETL
  • Data Governance
  • Customer Interaction
  • Data Architecture
  • Presales
  • Database Management
  • Warehousing
  • Client Management
  • Informatica IDMC
  • Data Quality Management
  • Cloud Platforms
  • Data Documentation
  • Real-Time Data Processing
Job Description
As an Informatica Manager on the EY-GDS D&A (Data and Analytics) team, your role involves overseeing data integration projects, leading a team, designing data integration architectures, developing ETL processes, and optimizing performance. You will work with Informatica PowerCenter, IDMC, Oracle, DB2, PL/SQL, and SQL technologies to deliver impactful solutions in industries such as Banking, Insurance, Manufacturing, Healthcare, Retail, Supply Chain, and Finance.

**Key Responsibilities:**
- **Project Management:**
  - Oversee planning, execution, and delivery of data integration projects using Informatica tools.
  - Define project scope, objectives, and deliverables in coordination with stakeholders.
- **Team Leadership:**
  - Manage and guide a team, fostering a collaborative environment for knowledge sharing.
- **Architecture and Design:**
  - Design and implement data integration architectures that meet business requirements.
  - Analyze existing data structures to design ETL processes and data warehouse structures.
- **Development and Implementation:**
  - Create, maintain, and optimize ETL processes, stored procedures, functions, and database structures.
  - Perform advanced ETL development using Informatica, PL/SQL, and Oracle database tuning.
- **Performance Tuning:**
  - Monitor and optimize the performance of ETL processes and data workflows.
  - Identify and resolve performance bottlenecks to ensure efficient data movement.
- **Data Quality Management:**
  - Implement data quality checks to ensure data accuracy and integrity.
  - Make recommendations for system changes based on data profiling and analysis.
- **Collaboration with IT and Business Teams:**
  - Work closely with IT and business teams in designing and implementing the data warehouse.
  - Collaborate with subject matter experts to understand data behavior and characteristics.
- **Documentation and Reporting:**
  - Use data modeling and documentation tools to analyze and present data analysis work.
  - Maintain documentation of data integration processes, workflows, and architecture.
- **Training and Support:**
  - Provide training and support on Informatica tools to end users and team members.
  - Act as a point of contact for troubleshooting data integration issues.
- **Continuous Improvement:**
  - Stay updated with the latest data integration technologies and propose process improvements.

**Qualifications Required:**
- **Education:** BE/BTech/MCA with 10+ years of IT development experience.
- **Informatica Expertise:** Proven experience in designing and implementing solutions using Informatica tools.
- **Cloud Platform Experience:** Hands-on experience with public cloud platforms for data integration.
- **Regulatory Knowledge:** Strong understanding of Data Governance frameworks and compliance regulations.
- **Data Documentation:** Experience in producing Data Lineage Reports and Business Glossaries.
- **Data Integration Experience:** 5+ years of experience in data integration solutions.
- **Real-Time Data Processing:** Exposure to real-time data processing methodologies within the Informatica ecosystem.
- **Customer Interaction:** Ability to manage customer interactions effectively.
- **Data Management Knowledge:** In-depth knowledge of Data Architecture and Data Modeling.
- **Database and Warehousing Experience:** Experience with databases, data warehousing, and high-performance computing environments.
- **Presales and Presentations:** Experience in responding to RFPs and delivering customer presentations.
- **Domain Experience:** Experience in the Insurance and Finance domains is nice to have.

Join EY to work on inspiring projects, receive support and coaching from engaging colleagues, and find opportunities for skill development and career progression in a collaborative environment focused on high quality and knowledge exchange.
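As an illustration of the "Data Quality Management" responsibility above, here is a minimal, self-contained sketch of pre-load quality checks in Python. The stg_policy staging table, its columns, and the specific rules are hypothetical; in an Informatica environment such checks would more typically be built as Data Quality rules, mapping-level validations, or PL/SQL procedures rather than a standalone script.

```python
# Illustrative only: simple pre-load data quality checks (null business keys,
# duplicate business keys, out-of-range values) against a hypothetical staging table.
import sqlite3

def run_quality_checks(con):
    """Return a dict mapping each check name to the count of offending staging rows."""
    checks = {
        "null_business_key": "SELECT COUNT(*) FROM stg_policy WHERE policy_id IS NULL",
        "duplicate_business_key": (
            "SELECT COUNT(*) FROM (SELECT policy_id FROM stg_policy "
            "GROUP BY policy_id HAVING COUNT(*) > 1)"
        ),
        "negative_premium": "SELECT COUNT(*) FROM stg_policy WHERE premium < 0",
    }
    return {name: con.execute(sql).fetchone()[0] for name, sql in checks.items()}

if __name__ == "__main__":
    # In-memory staging table with sample rows so the sketch runs on its own.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE stg_policy (policy_id TEXT, premium REAL)")
    con.executemany(
        "INSERT INTO stg_policy VALUES (?, ?)",
        [("P001", 1200.0), ("P001", 1200.0), (None, 300.0), ("P002", -50.0)],
    )
    failures = {name: n for name, n in run_quality_checks(con).items() if n}
    print("Failed checks:", failures or "none")
```

Gating the load on checks like these, and reporting the offending row counts back to data owners, is one common way the profiling-and-recommendation duties described above are put into practice.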