
3 Data Flow Jobs nearby Mysore

posted 1 week ago
experience: 4 to 8 Yrs
location
Mysore, Karnataka
skills
  • AutoCAD
  • Continuous Improvement
  • Process Improvement
  • Quality Control
  • FMEA
  • Poka Yoke
  • Engineering Drawings
  • Root Cause Analysis
  • Communication Skills
  • Lean Methodologies
Job Description
As a Process Expert at our company, you will play a crucial role in leading operations and quality initiatives across the entire plant, including board machines (M1 & M2), paper production, and component areas. Your focus will be on driving continuous improvement, optimizing in-process quality control, and supporting expansion and standardization.

**Key Responsibilities:**
- Develop and implement new manufacturing systems and processes to support product integration and operational improvements.
- Collaborate with cross-functional teams to prepare and maintain essential process documentation, including process operation charts, control plans, process flow charts, SOPs, tooling lists, and production capacity studies.
- Design and install new equipment and tooling for production lines using AutoCAD and other design tools.
- Monitor and improve key performance indicators such as Overall Equipment Effectiveness (OEE) and SPC studies.
- Conduct time and motion studies, cycle time analysis, and line balancing to optimize throughput and reduce manufacturing costs.
- Lead process improvement initiatives using methodologies such as 6S, Kaizen, and Value Stream Mapping (VSM).
- Analyse customer complaints and implement effective Corrective and Preventive Actions (CAPA) in machining and assembly processes.
- Continuously improve process quality and reduce non-value-added activities in board, paper, and component manufacturing units.
- Organize plant start-up and shutdown schedules to minimize production loss; respond to equipment breakdowns and report downtime trends.
- Undertake special projects and contribute to ongoing improvement efforts; perform root cause analysis and resolve technical problems.
- Drive process optimization and standardization across the plant; champion in-process quality control (IPQC) and ensure adherence to quality standards.
- Create and maintain engineering drawings, engineering orders, and Engineering Change Notices (ECNs).
- Ensure timely updates and accuracy of all engineering data within the Product Lifecycle Management (PLM) software.

**Qualifications Required:**
- Full-time BE/B.Tech in Mechanical/Production/Industrial Engineering or B.Sc. in Paper Technology.
- Minimum 4 to 6 years of relevant work experience.
- Hands-on project experience with AutoCAD and PLM software.
- Strong knowledge of continuous process manufacturing systems, tooling design, and lean methodologies; knowledge of pulp and paper manufacturing processes is an advantage.
- Experience with FMEA, Poka Yoke, and continuous improvement practices.
- Excellent analytical, problem-solving, and communication skills.

In addition, you will be responsible for living Hitachi Energy's core values of safety and integrity and ensuring compliance with applicable external and internal regulations, procedures, and guidelines. If you are a qualified individual with a disability requiring accessibility assistance or accommodation during the job application process, you may request reasonable accommodations by completing a general inquiry form on our website. Please include your contact information and specific details about your required accommodation.
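For a sense of the OEE metric this role monitors, here is a minimal sketch of the standard Availability × Performance × Quality calculation in Python; the shift figures are hypothetical illustration values, not data from the posting.

```python
# A minimal sketch of the OEE calculation mentioned above, using the
# standard Availability x Performance x Quality decomposition.
# All shift figures below are hypothetical illustration values.

def oee(planned_time_min: float, downtime_min: float,
        ideal_cycle_time_min: float, total_count: int,
        good_count: int) -> dict:
    """Return OEE and its three component factors."""
    run_time = planned_time_min - downtime_min
    availability = run_time / planned_time_min
    performance = (ideal_cycle_time_min * total_count) / run_time
    quality = good_count / total_count
    return {
        "availability": availability,
        "performance": performance,
        "quality": quality,
        "oee": availability * performance * quality,
    }

if __name__ == "__main__":
    # Example: 480 min shift, 45 min downtime, 0.5 min ideal cycle time,
    # 800 pieces produced, 780 good.
    print(oee(480, 45, 0.5, 800, 780))
```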
ACTIVELY HIRING

posted 2 months ago
experience: 10 to 15 Yrs
location
Mysore, Karnataka
skills
  • HTML
  • CSS
  • JavaScript
  • Google Analytics
  • data privacy
  • UI/UX design
  • AI-driven design principles
  • AI technologies
  • AI specialists
  • data scientists
  • product managers
  • AI-powered tools
  • Figma
  • Sketch
  • Adobe XD
  • InVision
  • user research methodologies
  • front-end development
  • Hotjar
  • Crazy Egg
  • AI ethics
Job Description
As an experienced Product Manager - UI/UX with expertise in AI-driven design principles, your role at MADTECH.AI will be highly collaborative and technical. You will lead the enhancement of the user experience for the AI-powered B2B SaaS platform by combining advanced UI/UX design knowledge with a technical understanding of AI technologies. Your work will involve collaborating with cross-functional teams to drive innovation through cutting-edge design.

Key Responsibilities:
- AI-Enhanced Design Strategy: Lead the integration of AI into the design process using data-driven design, AI-powered personalization, and predictive analytics for optimizing user experiences.
- Collaborative Product Management: Work closely with AI and data science teams to incorporate intelligent features like natural language processing (NLP) and machine learning (ML) into the design process.
- User-Centered Research & Analysis: Conduct comprehensive user research, including AI-assisted sentiment analysis and heatmap tracking, to inform the design strategy.
- Prototyping with AI Insights: Utilize advanced AI-based prototyping tools to simulate user interactions and incorporate user feedback and behavioral data into design iterations.
- Iterative Design Process: Leverage AI tools for continuous testing, including A/B testing and multivariate testing, to refine user flows and high-fidelity mockups based on real-time data.
- Cross-Functional Collaboration: Partner with AI engineers, data scientists, and QA teams to ensure design solutions align with the technical capabilities of AI features.
- AI-Driven Personalization: Develop and implement AI-driven UI elements that personalize the user experience based on data insights to improve user engagement and retention.
- Data Analytics & User Testing: Use AI tools for advanced usability testing and analyze data to optimize the UI/UX design.
- AI Ethics & Accessibility: Ensure AI features adhere to ethical standards, enhance transparency and inclusivity, and comply with WCAG guidelines for accessible design.
- Documentation & Reporting: Present data-backed design insights to stakeholders and utilize AI analytics dashboards to explain user behavior and design impact effectively.

Qualifications and Skills:
- Bachelor's or Master's in Engineering, IT, or a related field; advanced coursework or certifications in AI, machine learning, or data science are a plus.
- 10-15 years of experience in UI/UX design, with at least 4 years working on AI-driven products, preferably in B2B SaaS or enterprise-level applications.
- Proficiency in Figma and experience with design tools like Sketch, Adobe XD, and InVision; experience with AI-based design tools is an added advantage.
- Deep knowledge of user research methodologies, usability testing, and collaboration with AI specialists, data scientists, and engineers.
- Strong communication skills for presenting AI-driven design insights and strategies to stakeholders clearly.
- Understanding of AI models and their integration into UX/UI, along with familiarity with front-end development technologies.
- Knowledge of AI ethics, data privacy, and regulatory standards in AI-driven product design.
- Willingness to work from the Mysore/Bangalore office.
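As context for the A/B-testing duties above, here is a minimal sketch of a two-proportion z-test for comparing conversion rates between two UI variants; the sample counts are hypothetical and only the Python standard library is used.

```python
# A minimal A/B-test evaluation sketch: a two-proportion z-test comparing
# conversion rates of two UI variants. Sample counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def ab_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Variant B converts 260/2000 vs. variant A at 220/2000.
print(f"p-value: {ab_test(220, 2000, 260, 2000):.4f}")
```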
ACTIVELY HIRING
posted 2 months ago
experience: 5 to 9 Yrs
location
Mysore, Karnataka
skills
  • system configuration
  • software integration
  • Excel
  • SQL
  • BI tools
  • leadership
  • critical thinking
  • communication skills
  • data handling
  • REST APIs
  • cloud platforms
  • professional skills
  • problem-solving
Job Description
As a Lead Software Implementation Analyst at our company, you will play a crucial role in anchoring the implementation of Capillary's healthcare loyalty platform. Your responsibilities will involve a combination of hands-on configuration tasks and leadership duties, including managing analysts, driving structured execution, and ensuring high-quality delivery at scale. You will have the opportunity to work closely with large enterprise clients, guiding them through complex implementations while also mentoring a team of implementation analysts.

- **Team & Project Leadership:**
  - Lead a small team of analysts to execute multiple concurrent client implementations efficiently and with accountability.
  - Set up structured implementation plans, define roles and responsibilities, and ensure adherence to delivery milestones.
  - Foster a culture of continuous improvement, ownership, and knowledge sharing within the team.
- **Client Engagement & Delivery Excellence:**
  - Act as the primary implementation lead for high-value or complex clients.
  - Collaborate with senior client stakeholders to understand objectives, align expectations, and drive successful adoption.
  - Proactively address client escalations, provide solutions, and ensure a positive experience across the onboarding journey.
- **Configuration & Solution Design:**
  - Translate business requirements into scalable configurations using Capillary's proprietary tools and APIs.
  - Validate end-to-end system behavior through testing and ensure robustness, compliance, and accuracy in the final delivery.
- **Data Integration & Quality Management:**
  - Oversee data ingestion, cleansing, mapping, and validation for loyalty program launches.
  - Partner with Engineering and QA teams to troubleshoot data or logic issues across different environments.
- **Operational Discipline & Documentation:**
  - Drive clear and auditable documentation of configuration logic, timelines, and decision points.
  - Implement internal checklists and review frameworks to ensure delivery quality and consistency.
- **Scalability & Continuous Improvement:**
  - Identify opportunities to automate or templatize implementation components.
  - Influence improvements to tooling, processes, and internal training based on learnings from delivery cycles.

**Required Qualifications:**
- **Technical Skills:**
  - Strong foundation in system configuration, data handling, and software integration (REST APIs, data flows, etc.).
  - Comfortable working with data in Excel, SQL, or BI tools like Power BI, Looker, etc.
  - Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data architecture.
- **Leadership & Professional Skills:**
  - Proven ability to lead and grow small teams while maintaining high delivery standards.
  - Demonstrated experience in managing large or complex client implementations in a SaaS/product setup.
  - Strong critical thinking, structured problem-solving, and communication skills.
  - Ability to operate with clarity in high-pressure or fast-paced environments.
- **Experience & Education:**
  - 5-8 years of experience in software implementation, configuration, or solution delivery roles.
  - At least 2 years in a lead role, managing people or leading enterprise-scale implementation efforts.
  - Bachelor's degree in Engineering, Computer Science, Information Systems, or a related technical discipline.
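To illustrate the ingestion-and-validation step described above in generic terms, here is a small Python sketch; the endpoint and field names are hypothetical placeholders, not Capillary's actual API, and the third-party `requests` package is assumed.

```python
# A generic sketch of data ingestion plus attribute validation. The URL
# and field names below are hypothetical placeholders, not Capillary's API.
import requests

REQUIRED_FIELDS = {"member_id", "tier", "points_balance"}  # assumed schema

def fetch_and_validate(url: str) -> tuple[list[dict], list[dict]]:
    """Split fetched records into valid rows and exceptions for review."""
    records = requests.get(url, timeout=30).json()
    valid, exceptions = [], []
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing or not isinstance(rec.get("points_balance"), (int, float)):
            exceptions.append({"record": rec, "missing": sorted(missing)})
        else:
            valid.append(rec)
    return valid, exceptions

valid, exceptions = fetch_and_validate("https://example.com/api/members")
print(f"{len(valid)} valid rows, {len(exceptions)} exceptions to review")
```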
ACTIVELY HIRING
posted 1 week ago

Data Engineer

MEEDEN LABS PRIVATE LIMITED
experience: 5 to 8 Yrs
Salary: 12 - 24 LPA
location
Bangalore
skills
  • aws
  • python
  • pyspark
Job Description
Key Skills:
- Strong experience in Python
- Hands-on experience with PySpark
- Expertise in ETL processes and data pipeline development
- Experience working on any cloud platform (AWS or Azure)

Interview Process:
- Level 1 (L1): Technical interview
- Level 2 (L2): Technical interview
- Executive Leadership (EL): Final round, in person

Responsibilities:
- Design, build, and maintain scalable data pipelines and ETL workflows
- Work with large datasets to ensure efficient data flow and quality
- Collaborate with cross-functional teams to support data-driven solutions
- Optimize performance of data processing systems

Please reply with your confirmation.
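As an illustration of the PySpark ETL work this posting describes, here is a minimal sketch that reads raw CSV from S3, cleans it, and writes partitioned Parquet; the bucket names and columns are hypothetical, and Spark is assumed to be configured with S3 access (e.g., on EMR).

```python
# A minimal PySpark ETL sketch: read raw CSV from S3, deduplicate and
# filter, then write partitioned Parquet. Buckets/columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3a://example-raw/orders/")

clean = (
    raw.dropDuplicates(["order_id"])                      # remove repeats
       .filter(F.col("amount").cast("double") > 0)        # drop bad rows
       .withColumn("order_date", F.to_date("order_ts"))   # partition key
)

(clean.write.mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-curated/orders/"))
```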
INTERVIEW ASSURED IN 15 MINS
posted 1 week ago

GCP Data Engineer

Hucon Solutions India Pvt. Ltd.
experience: 5 to 9 Yrs
Salary: 10 - 20 LPA
location
Hyderabad, Chennai, Bangalore, Kochi
skills
  • devops
  • gcp
  • python
  • etl
  • docker
  • kubernetes
  • bigquery
  • terraform
  • dataflow
Job Description
Job Title: GCP Data Engineer
Employment Type: Permanent
Industry of the Employer: IT / Software Services
Department / Functional Area: Data Engineering, Cloud Engineering

Greetings from HUCON Solutions! We are hiring a GCP Data Engineer for a leading MNC.

Role: GCP Data Engineer
Skills: GCP, Python, SQL, BigQuery, ETL Data Pipeline
Experience: 5-9 years
Locations: Chennai, Kochi, Bangalore, Hyderabad, Pune

Eligibility Criteria:
- Strong hands-on experience in Python, Pandas, NumPy
- Expertise in SQL and a solid understanding of data warehousing concepts
- In-depth knowledge of GCP services, including BigQuery, Cloud Run, Pub/Sub, Cloud Storage, Spanner, Cloud Composer, Dataflow, and Cloud Functions
- Experience with Docker, Kubernetes, and GitHub
- Strong communication and analytical skills
- Working experience with PySpark, data modeling, Dataproc, and Terraform
- Google Cloud Professional Data Engineer certification is highly preferred

Role Responsibilities:
- Design, develop, and manage robust and scalable ETL/ELT pipelines on GCP using Dataflow, Cloud Run, BigQuery, Pub/Sub, and Cloud Composer
- Implement data ingestion workflows for structured and unstructured data sources
- Build and maintain data warehouses and data lakes using BigQuery, Cloud Storage, and other GCP tools
- Work with data modeling, architecture, and ETL/ELT frameworks
- Ensure best practices in performance, optimization, and cloud data management

Keywords / Skills: GCP, Python, SQL, BigQuery, ETL, Data Pipeline, Data Engineering, Cloud Run, Pub/Sub, Cloud Storage, Dataproc, Terraform, Kubernetes, Docker, Dataflow

Total Experience: 5 to 9 years
Salary Type: Yearly
Annual Salary Offered: As per company standards
Job Type: Full Time
Shift Type: Day Shift / Rotational (based on project requirement)
Location of the Job: Chennai | Kochi | Bangalore | Hyderabad | Pune
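For context, here is a minimal Apache Beam sketch of the kind of GCS-to-BigQuery batch pipeline this role builds; the project, bucket, and table IDs are placeholders, and the job would be launched with --runner=DataflowRunner to execute on Dataflow.

```python
# A minimal Beam batch ETL sketch: read JSON lines from Cloud Storage,
# parse them, and append to BigQuery. All resource names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse(line: str) -> dict:
    rec = json.loads(line)
    return {"user_id": rec["user_id"], "amount": float(rec["amount"])}

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "Parse" >> beam.Map(parse)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```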
INTERVIEW ASSURED IN 15 MINS
posted 2 months ago

GCP Big Data Engineer

Acme Services Private Limited
experience: 6 to 11 Yrs
Salary: 16 - 28 LPA
location
Bangalore, Gurugram, Pune, Mumbai City
skills
  • spark
  • gcp
  • airflow
  • data
  • scala
  • bigquery
Job Description
We are seeking a seasoned GCP Data Analytics professional with extensive experience in Big Data technologies and Google Cloud Platform services to design and implement scalable data solutions.

- Design, develop, and optimize data pipelines using GCP BigQuery, Dataflow, and Apache Airflow to support large-scale data analytics
- Utilize the Big Data Hadoop ecosystem to manage and process vast datasets efficiently
- Collaborate with cross-functional teams to gather requirements and deliver reliable data solutions
- Ensure data quality, consistency, and integrity across multiple data sources
- Monitor and troubleshoot data workflows to maintain high system availability and performance
- Stay updated on emerging trends and best practices in GCP data analytics and big data technologies

Roles and Responsibilities:
- Implement and manage ETL processes leveraging GCP services such as BigQuery, Dataflow, and Airflow
- Develop scalable, maintainable, and reusable data pipelines to support business intelligence and analytics needs
- Optimize SQL queries and data models for performance and cost efficiency in BigQuery
- Integrate Hadoop ecosystem components with GCP services to enhance data processing capabilities
- Automate workflow orchestration using Apache Airflow for seamless data operations
- Collaborate with data engineers, analysts, and stakeholders to ensure alignment of data solutions with organizational goals
- Participate in code reviews, testing, and deployment activities, adhering to best practices
- Mentor junior team members and contribute to continuous improvement initiatives within the data engineering team

Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI SQL, GCP Dataflow, Big Data Hadoop Ecosystem
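As an illustration of the Airflow orchestration this posting emphasizes, here is a minimal DAG sketch that runs a daily BigQuery rollup; the SQL, table names, and schedule are illustrative placeholders, and Airflow 2.4+ with the Google provider package is assumed.

```python
# A minimal Airflow DAG sketch: one daily BigQuery rollup task.
# Table names, SQL, and schedule are illustrative placeholders.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    SELECT store_id, DATE(order_ts) AS d, SUM(amount) AS total
                    FROM `example-project.raw.orders`
                    GROUP BY store_id, d
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "marts",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```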
INTERVIEW ASSURED IN 15 MINS
posted 2 days ago
experience: 5 to 9 Yrs
location
Karnataka
skills
  • IBP
  • Incident Management
  • Project Management
  • Microsoft Office
  • ERP
  • Supply Chain Impact Assessments
  • SAP S/4
  • Master Data Governance
  • SCM Product Flow
  • Business Scenarios
  • Packaging Materials Process
  • Data Quality Management
  • Deviations
  • CAPAs
  • rfXcel
Job Description
Role Overview: As a Supply Chain Master Data Specialist at Astellas, your primary responsibility will be to conduct supply chain impact assessments for master data in global systems such as SAP S/4 and IBP. You will represent the master data team in various projects, create and maintain global data rules, and train other Supply Chain Management (SCM) colleagues.

Key Responsibilities:
- Perform the role of Technical Data Steward (TDS) as per Astellas master data governance policies and process designs.
- Evaluate incoming Impact Assessments (IA) and communicate relevant impacts to the SCM Master Data Management (MDM) team, collaborating with stakeholders to find solutions when necessary.
- Understand SCM product flows and communicate effectively with SCM MDM colleagues, overseeing the impact of product flows on master data and internal documentation.
- Maintain a thorough understanding of business scenarios, making decisions based on impact assessments and instructing team members on creating new business scenarios, documentation procedures, and standards.
- Stay familiar with procedures and systems related to packaging materials processes, providing support to team members.
- Manage data quality by ensuring adherence to data standards, identifying the need for new or updated data rules, and overseeing the implementation of these rules.
- Handle data quality incident management by determining root causes of data-related incidents, establishing action plans, and managing the resolution process.
- Take ownership of deviations and Corrective and Preventive Actions (CAPAs) as part of the resolution process.
- Identify potential candidates for deactivation, lead the deactivation process, and ensure regular deactivation procedures are followed.
- Represent the MDM team in projects, participate in project activities, and escalate issues when necessary.
- Vet incoming change requests, assess their validity, and communicate outcomes to the requestor.

Qualifications Required:
- Bachelor's degree preferred.
- Ability to develop and maintain master data management models, tools, and methods to enhance data quality.
- Strong conceptual thinking with a deep understanding of supply chain flows and their relationship to SCM master data.
- Up to date on trends in master data governance, applying relevant knowledge in the MDM context.
- Capable of delivering on project objectives, managing stakeholders, and leading teams.
- Hands-on knowledge of operational processes in ERP systems, preferably in the pharma domain.
- Ability to lead people and teams, contribute to team development, and participate in job interviews and appraisals.
- Ability to lead the development of knowledge management and data management in the area of master data.

Additional Company Details: Astellas Global Capability Centres (GCCs) in India, Poland, and Mexico play a vital role in enhancing operational efficiency, resilience, and innovation potential. These GCCs are integral parts of Astellas, guided by shared values and behaviors, supporting the company's strategic priorities and commitment to delivering value to patients. Astellas is dedicated to promoting equality of opportunity in all aspects of employment, including Disability/Protected Veterans.
ACTIVELY HIRING
posted 3 weeks ago
experience: 6 to 13 Yrs
location
Karnataka
skills
  • Google Cloud Platform
  • Apache Spark
  • Python
  • Hadoop
  • SQL
  • Cloud Storage
  • Machine Learning
  • Java
  • Relational Databases
  • Apache Beam
  • Google Dataflow
  • BigQuery
  • Bigtable
  • Datastore
  • Spanner
  • Cloud SQL
  • Analytical Databases
  • NoSQL databases
Job Description
As a GCP Data Engineer, you will be responsible for implementing and architecting solutions on Google Cloud Platform using GCP components. Your expertise in Apache Beam, Google Dataflow, and Apache Spark will be crucial in creating end-to-end data pipelines, and your experience with technologies like Python, Hadoop, Spark, SQL, and BigQuery will play a vital role in delivering high-quality solutions.

Key Responsibilities:
- Implement and architect solutions on Google Cloud Platform
- Create end-to-end data pipelines using Apache Beam, Google Dataflow, and Apache Spark
- Utilize technologies like Python, Hadoop, Spark, SQL, BigQuery, Cloud Storage, and others
- Program in Java, Python, and other relevant languages
- Demonstrate expertise in relational, analytical, and NoSQL databases
- Identify downstream implications of data loads/migration
- Automate data ingestion, transformation, and augmentation through data pipelines
- Provide best practices for pipeline operations
- Enable simplified user access to massive data by building scalable data solutions
- Utilize advanced SQL writing and data mining techniques
- Work effectively in a rapidly changing business environment

Qualifications Required:
- 6-13 years of experience in IT or professional services, with a focus on IT delivery or large-scale IT analytics projects
- Expert knowledge of Google Cloud Platform; knowledge of other cloud platforms is a plus
- Proficiency in SQL development
- Experience in building data integration and preparation tools using cloud technologies
- Google Professional Data Engineer/Solution Architect certification is advantageous
- Ability to work with a variety of technologies, including Java and Python
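For context on the advanced-SQL side of the role, here is a small sketch of running a parameterized BigQuery query from Python; the dataset and table names are placeholders, and the google-cloud-bigquery package with application default credentials is assumed.

```python
# A minimal sketch of a parameterized BigQuery query from Python.
# Dataset and table names are placeholders; assumes ADC auth is set up.
import datetime
from google.cloud import bigquery

client = bigquery.Client()  # picks up the project from the environment

job = client.query(
    """
    SELECT user_id, COUNT(*) AS events
    FROM `example-project.analytics.events`
    WHERE event_date >= @start
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
    """,
    job_config=bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter(
                "start", "DATE", datetime.date(2024, 1, 1))
        ]
    ),
)
for row in job.result():
    print(row.user_id, row.events)
```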
ACTIVELY HIRING
posted 2 days ago

Agentic cloud Data Engineer (AWS)

Sustainability Economics.ai
experience: 2 to 6 Yrs
location
Karnataka
skills
  • Glue
  • EMR
  • Python
  • SQL
  • data engineering
  • AI-driven data pipelines
  • agentic databases
  • orchestration frameworks
  • AWS services
  • Lambda
  • data modelling
  • integrating third-party APIs
  • data scraping
  • schema design
  • relational database concepts
  • SQL query optimization
  • integrating APIs
  • AI components
Job Description
As a skilled and experienced Agentic Data Engineer at Sustainability Economics.ai in Bengaluru, Karnataka, you will design intelligent, scalable, and adaptive data architectures that support next-generation AI-based systems and applications. Your role will involve building adaptive data pipelines, optimizing data models, integrating AI-driven components, and working with cross-functional teams to turn business requirements into technical solutions. Here's what you can expect in this role:

Key Responsibilities:
- Design, develop, and maintain AI-driven, agentic data pipelines and intelligent end-to-end architectures.
- Build and optimize data models and schemas to adapt to evolving business needs.
- Integrate AI-driven components and advanced data workflows into production systems.
- Implement scalable pipelines on AWS using services such as S3, Glue, EMR, Lambda, and DynamoDB.
- Develop and manage ETL/ELT pipelines, integrating third-party APIs and diverse data sources.
- Apply best practices in data governance, lineage, and observability with modern monitoring tools.
- Optimize SQL queries, storage, and throughput to improve efficiency across data systems.
- Document pipelines, data flows, and transformations for clarity and maintainability.
- Explore emerging tools in agentic systems, orchestration frameworks, and AWS for continuous improvement.

Key Requirements:
- Bachelor's degree in Computer Science, Data Engineering, or a related field.
- Minimum of 2 years of experience in data engineering, with exposure to AI-driven or agentic systems.
- Proficiency in AWS services such as S3, Glue, EMR, Lambda, and DynamoDB, plus Postgres.
- Hands-on experience in data scraping, data modeling, schema design, and relational database concepts.
- Strong knowledge of SQL, including performance tuning and query optimization.
- Prior experience integrating APIs and AI components into data pipelines.

What You'll Do:
- Build adaptive data pipelines that can self-optimize and scale with business needs.
- Develop systems supporting multimodal data (text, geospatial).
- Enable real-time transformations in pipelines using agentic architectures.
- Contribute to the evolution of agentic databases and autonomous data workflows.

What You Bring:
- Passion for AI, agentic architectures, and next-gen data systems.
- Strong problem-solving skills and a drive for innovation.
- Ability to collaborate across engineering, product, and data science teams.
- An agile mindset, adaptability, and eagerness to learn emerging technologies.

In summary, at Sustainability Economics.ai you will have the opportunity to work on a first-of-its-kind AI + clean energy platform, collaborate with a mission-driven team, and contribute to the intersection of AI, design, and sustainability.
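To make the AWS pipeline pattern concrete, here is a minimal sketch of one event-driven step: an S3-triggered Lambda handler that starts a Glue job via boto3; the job and bucket names are hypothetical.

```python
# A minimal event-driven pipeline sketch: an S3-triggered Lambda handler
# that kicks off a Glue job for each new object. Names are hypothetical.
import boto3

glue = boto3.client("glue")

def handler(event, context):
    """Start the curation Glue job for every new S3 object in the event."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        run = glue.start_job_run(
            JobName="curate_raw_events",  # hypothetical Glue job name
            Arguments={"--source_path": f"s3://{bucket}/{key}"},
        )
        print(f"Started Glue run {run['JobRunId']} for s3://{bucket}/{key}")
```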
ACTIVELY HIRING
posted 1 week ago
experience: 8 to 12 Yrs
location
Karnataka
skills
  • Data Engineering
  • Data Quality
  • Automation
  • Data Governance
  • Analytical Skills
  • Interpersonal Skills
  • Communication Skills
  • Relationship Building
  • Financial Transactions
  • Controls Development
  • Alteryx
  • Data Validation
  • Data Monitoring
  • Data Solutions
  • Root Cause Analysis
  • Machine Learning
  • Operational Data Management
  • AI
  • Next-Generation Technologies
  • Automation Tooling
  • Pro Dev Capabilities
  • AI Strategies
  • Client Relationships
  • PowerBI
  • GenAI
  • Exception Processes
  • Data Flows
  • Functional Data Strategies
  • Exception Reporting
  • Workflow Building
  • Global Alignment
  • Regulatory Standards
  • Problem-Solving
  • Sustainable Solutions
  • Reconciliation Utilities
Job Description
You will lead a global team ensuring the accuracy, completeness, and transparency of reference and transactional data supporting Transaction Lifecycle Management at a leading global asset manager. Your responsibilities will include:

- Managing and remediating data quality issues across middle office platforms, ensuring accuracy, timeliness, and completeness of reference and transactional data.
- Developing tools for the operations team using industry-leading technologies like Alteryx, Power BI, and GenAI to pursue automation and enhance efficiency.
- Maintaining robust data validation and exception processes with clear monitoring and escalation.
- Overseeing input/output data flows across platforms, collaborating with architecture, design, and control teams to support front-to-back transaction lifecycle processes.
- Developing and executing functional and outcome-focused data strategies, ensuring data solutions are fit for purpose and aligned with business needs.
- Providing full transparency into operational data quality, including readiness for downstream use, by implementing tiered monitoring and exception reporting.
- Enabling high-touch data solutions to address complex requirements, supporting the build of more data-enabled workflows and active monitoring.
- Driving an end-to-end product mindset across the team, ensuring data management processes support seamless trade lifecycle execution.
- Proactively identifying inefficiencies, re-engineering workflows, and applying AI/automation to streamline controls and reduce manual intervention.
- Developing and operationalizing data quality controls and embedding them into production workflows.
- Partnering with the enterprise Data Strategy & Governance team to ensure global alignment with data standards, frameworks, and policies.
- Leading, mentoring, and growing a global team, fostering a culture of accountability, innovation, and continuous improvement.

Qualifications required for this role:
- 8+ years of experience in Data Engineering, Data Quality, or Transaction Lifecycle Operations within asset management, investment banking, or related financial institutions.
- Strong expertise in middle office data processes, including trade, reference, and transactional data management.
- Proven ability to lead global teams and build a culture of operational excellence and innovation.
- Experience with data engineering tools, data quality platforms, and reconciliation utilities.
- Understanding of enterprise data management principles, governance, and regulatory standards.
- Strong problem-solving and analytical skills, with the ability to identify root causes and design sustainable solutions.
- A demonstrated track record of applying automation, AI, or machine learning in data workflows.
- Excellent communication skills and the ability to collaborate across Technology, Operations, and Data Governance.
- Bachelor's or Master's degree in Computer Science, Engineering, Finance, or a related field.

Morgan Stanley, a global leader in financial services since 1935, is committed to maintaining excellence and diversity in serving clients and communities worldwide. They offer opportunities to work alongside a diverse and talented workforce, providing attractive employee benefits and perks. Morgan Stanley is an equal opportunities employer, fostering a culture of inclusion and supporting individuals to maximize their full potential.
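As a concrete illustration of the embedded data-quality controls described above, here is a minimal pandas sketch of attribute validations that produce an exception report; the column names, rules, and sample data are hypothetical.

```python
# A minimal data-quality control sketch: rule-based attribute validations
# producing an exception report. Columns, rules, and data are hypothetical.
import pandas as pd

rules = {
    "trade_id_present": lambda df: df["trade_id"].notna(),
    "notional_positive": lambda df: df["notional"] > 0,
    "currency_known": lambda df: df["currency"].isin(["USD", "EUR", "GBP"]),
}

def exception_report(df: pd.DataFrame) -> pd.DataFrame:
    """Return one row per failed rule per record, ready for escalation."""
    failures = []
    for name, rule in rules.items():
        failures.append(df[~rule(df)].assign(failed_rule=name))
    return pd.concat(failures, ignore_index=True)

trades = pd.DataFrame({
    "trade_id": ["T1", None, "T3"],
    "notional": [1_000_000, 250_000, -50],
    "currency": ["USD", "EUR", "XXX"],
})
print(exception_report(trades)[["trade_id", "failed_rule"]])
```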
ACTIVELY HIRING
posted 5 days ago
experience: 8 to 12 Yrs
location
Karnataka
skills
  • Data Governance
  • Metadata Management
  • AWS
  • Azure
  • Snowflake
  • SQL
  • Data Catalog
  • Cloud Data Profiling
  • Cross-System Lineage
  • ETL processes
  • Property & Casualty Insurance domain
Job Description
As an experienced Sr. Developer with 8 to 12 years of experience, you will contribute to enhancing the data infrastructure, ensuring seamless data flow and accessibility, and supporting the Property & Casualty Insurance domain. Your key responsibilities will include:

- Leading the design, development, and implementation of data governance frameworks to ensure data quality and compliance.
- Overseeing Metadata Management processes to maintain accurate and consistent metadata across systems.
- Providing expertise in Data Catalog and Manage Metadata Service to enhance data discoverability and usability.
- Implementing Cloud Data Profiling to assess data quality and identify areas for improvement.
- Developing and managing Cross-System Lineage to ensure data traceability and transparency.
- Utilizing AWS and Azure platforms to optimize data storage and processing capabilities.
- Integrating Snowflake solutions to enhance data warehousing and analytics capabilities.
- Designing and implementing ETL processes to ensure efficient data extraction, transformation, and loading.
- Utilizing SQL for data querying and manipulation to support business intelligence and reporting needs.
- Collaborating with cross-functional teams to align data management strategies with business objectives.
- Ensuring data security and privacy compliance in all data management activities.
- Providing technical guidance and support to team members to foster a collaborative and innovative work environment.
- Contributing to the continuous improvement of data management practices to drive organizational success.

Qualifications:
- Strong understanding of data governance principles and practices.
- Expertise in Metadata Management and related tools.
- Experience with Data Catalog and Manage Metadata Service for data management.
- Proficiency in Cloud Data Profiling and Cross-System Lineage techniques.
- Proficiency in AWS, Azure, and Snowflake platforms for data solutions.
- Solid understanding of ETL processes and SQL for data manipulation.
- Experience in the Property & Casualty Insurance domain is a plus.
ACTIVELY HIRING
posted 3 weeks ago
experience: 3 to 12 Yrs
location
Karnataka
skills
  • Big Data
  • GCP
  • SQL
  • Spark
  • Dataproc
  • Dataflow
Job Description
As a candidate for this role, you will be responsible for handling Big Data and GCP related tasks. Your key responsibilities will include:

- Working with Big Data technologies such as Dataproc, Dataflow, SQL, and Spark
- Utilizing GCP for efficient data processing and analysis

Essential qualifications:
- Proficiency in Big Data and GCP technologies
- Experience with Dataproc, Dataflow, SQL, and Spark

Please note that the job reference number for this position is 13154.
ACTIVELY HIRING
posted 4 days ago

GCP Data Architect

Orion Consulting
experience: 5 to 9 Yrs
location
Karnataka
skills
  • Data Modeling
  • Data Governance
  • Performance Optimization
  • Collaboration
  • Innovation
  • Programming
  • Communication
  • GCP Cloud Architecture
  • Big Data Processing
  • Data Pipelines
  • GCP Core Services
  • Big Data Technologies
  • Cloud Architecture
  • Problem-Solving
Job Description
As a Cloud Data Architect at our company, you will be responsible for designing, implementing, and managing robust, scalable, and cost-effective cloud-based data architectures on Google Cloud Platform (GCP). Your expertise will be crucial in leveraging services such as BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud DataProc, Cloud Run, and Cloud Composer. Experience designing cloud architectures on Oracle Cloud is considered a plus.

Key Responsibilities:
- Develop and maintain conceptual, logical, and physical data models to support various business needs.
- Design and implement solutions for processing large datasets using technologies such as Spark and Hadoop.
- Establish and enforce data governance policies, including data quality, security, compliance, and metadata management.
- Build and optimize data pipelines for efficient data ingestion, transformation, and loading.
- Monitor and tune data systems to ensure high performance and availability.
- Work closely with data engineers, data scientists, and other stakeholders to understand data requirements and provide architectural guidance.
- Stay current with the latest technologies and trends in data architecture and cloud computing.

Qualifications:
- In-depth knowledge of GCP data services, including BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, Cloud DataProc, Cloud Run, and Cloud Composer.
- Expertise in data modeling techniques and best practices.
- Hands-on experience with Spark and Hadoop.
- Proven ability to design scalable, reliable, and cost-effective cloud architectures.
- Understanding of data quality, security, compliance, and metadata management.
- Proficiency in SQL, Python, and DBT (Data Build Tool).
- Strong analytical and problem-solving skills.
- Excellent written and verbal communication skills.
- A Bachelor's degree in Computer Science, Computer Engineering, Data, or a related field, or equivalent work experience.
- GCP Professional Data Engineer or Cloud Architect certification is a plus.

Please note that this is a full-time position and the work location is in person.
ACTIVELY HIRING
posted 1 day ago

IT Data Analyst

NetConnect Private Limited
experience: 4 to 12 Yrs
location
Karnataka
skills
  • Power BI
  • SQL
  • Python
  • Data Modeling
  • ETL Processes
  • Data Warehousing Concepts
Job Description
As an IT Data Analyst/Engineer with 4 to 12 years of experience, you will join the data team in Bangalore or Pune. Your role is crucial in leveraging data to drive business decisions: designing and implementing end-to-end data solutions, developing insightful reports and dashboards, and serving as a subject matter expert for data analytics within the organization.

Responsibilities:
- Design and implement data solutions tailored to business needs
- Develop interactive dashboards and visual reports using Power BI
- Write and optimize complex SQL queries
- Utilize Python for data manipulation and advanced analytics
- Collaborate with stakeholders to translate business requirements into technical specifications
- Ensure data compliance and security standards

You will also be responsible for:
- Mentoring and supporting junior analysts and engineers
- Identifying opportunities to improve data flow, quality, and governance
- Participating in special analytics projects and strategic initiatives

Proficiency in Power BI, SQL, Python, data modeling, ETL processes, and data warehousing concepts is essential for this role, along with strong analytical, communication, and problem-solving skills.

This company values innovation, continuous learning, and upskilling opportunities. Joining the team will give you the chance to influence business decisions with impactful data solutions and gain exposure to industry-leading tools and best practices. If you are passionate about data analytics and seeking a full-time role, we encourage you to apply for this position.
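To illustrate the day-to-day SQL-plus-Python workflow this role describes, here is a small self-contained sketch that pulls data with SQL, shapes it with pandas, and exports a CSV feed for a dashboard; the table and column names are illustrative, and sqlite3 is used only to keep the example runnable.

```python
# A small analyst-workflow sketch: SQL query -> pandas reshaping -> CSV
# export for a BI dashboard. Tables/columns are illustrative; sqlite3
# keeps the example self-contained.
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('South', '2024-01', 120.0), ('South', '2024-02', 150.0),
        ('North', '2024-01', 90.0),  ('North', '2024-02', 80.0);
""")

df = pd.read_sql_query(
    "SELECT region, month, revenue FROM sales ORDER BY region, month", con
)
pivot = df.pivot(index="month", columns="region", values="revenue")
pivot["total"] = pivot.sum(axis=1)
pivot.to_csv("monthly_revenue.csv")  # feed for the dashboard
print(pivot)
```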
ACTIVELY HIRING
posted 3 weeks ago

LEAD DATA ENGINEER

Happiest Minds Technologies
experience: 8 to 12 Yrs
location
Karnataka
skills
  • Java
  • Spring Boot
  • HDFS
  • Hive
  • Spark
  • Oozie
  • Sqoop
  • GCP
  • Cloud Storage
  • IAM
  • Git
  • containerization
  • data quality
  • cost optimization
  • orchestration
  • RESTful microservices
  • Hadoop ecosystem
  • BigQuery
  • Dataproc
  • Dataflow
  • Composer
  • CICD
  • cloud deployment
  • schema management
  • lineage
  • multi-cloud
  • hybrid data movement
  • Agile practices
  • Computer Science
Job Description
As a Lead Data Engineer at Manyata Tech Park, your role will involve the following responsibilities:

- Strong expertise in Java (8+) and Spring Boot for building RESTful microservices and enterprise integrations.
- Hands-on experience with the Hadoop ecosystem, including HDFS, Hive/Spark, and Oozie/Sqoop, for building production-grade data pipelines.
- Proficiency in Google Cloud Platform (GCP) across data services such as BigQuery, Dataproc, Cloud Storage, Dataflow/Composer, and IAM/networking basics.
- Experience in CI/CD and automation using Git-based workflows, with familiarity with containerization and cloud deployment patterns.

Nice to have:
- Experience with schema management, data quality, lineage, and cost optimization on cloud data platforms.
- Exposure to multi-cloud or hybrid data movement and orchestration.

In terms of leadership, you are expected to demonstrate:
- Reliability, security, and performance while driving cross-team collaboration.
- Proficiency in Agile practices: planning, demos, and managing delivery risks proactively.

Qualifications required for this role:
- Bachelor's or Master's degree in Computer Science.
- 8-12+ years of experience in software engineering, with at least 3 years in a lead role delivering complex systems and data pipelines in production environments.

Please note that the above requirements are essential for excelling in this role as a Lead Data Engineer at Manyata Tech Park.
ACTIVELY HIRING
posted 2 months ago
experience: 2 to 6 Yrs
location
Karnataka
skills
  • SQL
  • Oracle
  • Postgres
  • Data Mart
  • OLTP
  • Python
  • ETL
  • Google Cloud Platform
  • Spring Boot
  • Microservices
  • NoSQL Database
  • DW
  • Data modelling
  • OLAP Systems
  • Load testing methodologies
  • Debugging pipelines
  • Delta load handling
  • Exploratory analysis
  • Apache Beam
  • Google Cloud BigTable
  • Google BigQuery
  • Apache Beam Framework
  • Dataflow
  • PubSub
  • Cloud Run
  • Cloud Function
Job Description
As an IT professional with over 2 years of experience, you should have a good understanding of analytics tools to analyze data effectively, along with the ability to learn new tools and technologies. Your previous work experience should include working with at least one structured database (such as SQL, Oracle, or Postgres) and one NoSQL database. A strong understanding of Data Warehouse (DW), Data Mart, and data modelling concepts is essential, and you should have been part of a Data Warehouse design team in at least one project.

Key Responsibilities:
- Be aware of design best practices for OLTP and OLAP systems
- Have exposure to load-testing methodologies, debugging pipelines, and delta load handling
- Create DAG files using Python and SQL for ETL processes
- Conduct exploratory analysis of logs
- Experience with Apache Beam development using Google Cloud Bigtable and Google BigQuery is desirable
- Familiarity with Google Cloud Platform (GCP)
- Ability to write batch and stream processing jobs using the Apache Beam framework (Dataflow)
- Experience with Spring Boot
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions

Qualifications Required:
- Minimum 2+ years of IT experience
- Strong understanding of analytics tools
- Proficiency in at least one structured database and one NoSQL database
- Knowledge of Data Warehouse, Data Mart, and data modelling concepts
- Experience on a Data Warehouse design team
- Familiarity with Python and SQL for ETL processes
- Exposure to Apache Beam development and Google Cloud services
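As an illustration of the stream-processing work mentioned above, here is a minimal sketch of a streaming Beam job that windows Pub/Sub events into per-minute counts; the topic name and event schema are placeholders, and the job would be launched with --streaming (and --runner=DataflowRunner to run on Dataflow).

```python
# A minimal streaming Beam sketch: read Pub/Sub messages, window into
# one-minute buckets, and count events per type. Names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events")
        | "Parse" >> beam.Map(lambda b: json.loads(b.decode("utf-8")))
        | "KeyByType" >> beam.Map(lambda e: (e["type"], 1))
        | "Window" >> beam.WindowInto(FixedWindows(60))  # 60-second windows
        | "Count" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```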
ACTIVELY HIRING
posted 5 days ago
experience: 1 to 5 Yrs
location
Karnataka
skills
  • Analytical skills
  • Python
  • SQL
  • Problem-solving skills
  • Google Sheets/Excel
  • BI tools (Zoho Analytics/Power BI/Tableau)
Job Description
As a Junior Data Analyst at GeekyAnts, your role will involve the following responsibilities:

- Prepare and update dashboards such as revenue analytics, talent analytics, and monthly MIS reports.
- Maintain dashboards to ensure they remain accurate, consistent, and error-free.
- Clean, validate, and organize raw data collected from various internal tools and spreadsheets.
- Automate recurring reporting tasks using spreadsheet formulas, scripts, or basic coding.
- Reconcile figures across multiple data sources and resolve mismatches (see the sketch after this description).
- Identify trends, variances, and anomalies in the data for management review.

To excel in this role, you should possess the following skills and qualifications:
- Strong analytical and problem-solving skills.
- Proficiency in Google Sheets/Excel (LOOKUPs, SUMIFS, QUERY, data cleaning).
- Basic scripting/coding knowledge for automation (e.g., Python or Apps Script, even at beginner level).
- Ability to run and modify simple scripts to automate repeated processes.
- Hands-on experience with any BI tool, preferably Zoho Analytics (or Power BI/Tableau).
- Good understanding of dashboards, data flows, and reporting principles.
- Detail-oriented with a focus on accuracy and consistency.
- Strong time and task management skills.

Preferred (but not mandatory) qualifications include exposure to automation tools or workflow scripting and familiarity with databases or basic SQL.

A Bachelor's degree in Data, Finance, Statistics, Computer Science, or a related field is required.

GeekyAnts is a design and development studio known for its expertise in cutting-edge technologies like React, React Native, Flutter, Angular, Vue, NodeJS, Python, and Svelte. They have a broad client base across various industries and offer services ranging from web and mobile development to UI/UX design, business analysis, product management, DevOps, QA, API development, and delivery & support. GeekyAnts has also made significant contributions to the open-source community, including the popular UI library NativeBase.

Best of luck with your application at GeekyAnts!
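For the reconciliation task mentioned above, here is a small pandas sketch that compares figures from two sources and surfaces mismatches; the file and column names are hypothetical.

```python
# A small reconciliation sketch: outer-join two revenue extracts and flag
# rows that are missing on one side or disagree beyond a tolerance.
# File and column names are hypothetical.
import pandas as pd

crm = pd.read_csv("crm_revenue.csv")          # columns: client_id, revenue
finance = pd.read_csv("finance_revenue.csv")  # same layout, other system

merged = crm.merge(finance, on="client_id", suffixes=("_crm", "_fin"),
                   how="outer", indicator=True)

mismatches = merged[
    (merged["_merge"] != "both")                                  # one-sided
    | ((merged["revenue_crm"] - merged["revenue_fin"]).abs() > 0.01)
]
mismatches.to_csv("mismatch_report.csv", index=False)
print(f"{len(mismatches)} rows need review")
```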
ACTIVELY HIRING
posted 1 week ago

Data Manager

TERCELHERBS PRIVATE LIMITED
experience: 3 to 8 Yrs
Salary: 6 - 12 LPA
location
Hyderabad, Chennai, Bangalore, Noida, Guntakal, Kolkata, Gurugram, Pune, Mumbai City, Delhi
skills
  • management
  • service
  • finance
  • skills
  • project
  • productions
Job Description
We are seeking an experienced Data Manager to lead the development and utilization of data systems. In this role, you will be responsible for identifying efficient methods to organize, store, and analyze data while maintaining strict security and confidentiality measures.

An exceptional Data Manager comprehends the intricacies of data management and possesses a deep understanding of databases and data analysis procedures. You should also possess strong technical acumen and exceptional troubleshooting abilities. Your primary objective will be to ensure the seamless and secure flow of information within and outside the organization, guaranteeing timely access and delivery of data. By implementing effective data management practices, you will contribute to the overall success of our organization. Join our team and be a key driver in optimizing our data systems, unlocking valuable insights, and supporting data-driven decision-making processes.

Responsibilities:
- Create and enforce policies for effective data management
- Formulate techniques for quality data collection to ensure the adequacy, accuracy, and legitimacy of data
- Devise and implement efficient and secure procedures for data handling and analysis, with attention to all technical aspects
- Establish rules and procedures for data sharing with upper management, external stakeholders, etc.
- Support others in the daily use of data systems and ensure adherence to legal and company standards
- Assist with reports and data extraction when needed
- Monitor and analyze information and data systems, evaluating their performance to discover ways of enhancing them (new technologies, upgrades, etc.)
- Ensure digital databases and archives are protected from security breaches and data losses

Requirements:
- Proven experience as a data manager
- Excellent understanding of data administration and management functions (collection, analysis, distribution, etc.)
- Familiarity with modern database and information system technologies
- Proficiency in MS Office (Excel, Access, Word, etc.)
- An analytical mindset with problem-solving skills
- Excellent communication and collaboration skills
- BSc/BA in computer science or a relevant field
posted 2 weeks ago

Data Management Controllers- Analyst

Chase- Candidate Experience page
experience: 2 to 6 Yrs
location
Karnataka
skills
  • Financial Analysis
  • Data Management
  • Data Quality
  • Data Analytics
  • Continuous Improvement
  • Communication Skills
  • Negotiation Skills
  • Critical Thinking
  • Process Improvement
  • Python
  • Tableau
  • Financial Strategies
  • Analytical Acumen
  • Risk Data Processing
  • Metrics Tracking
  • ProblemSolving Skills
  • Partnership Building
  • Influencing Skills
  • General Ledger Systems
  • Systems
  • Data Flows
  • AI/ML Skills (Alteryx)
  • Databricks
Job Description
As a Data Management Controller on our team, you will spend every day defining, refining, and delivering our financial strategies to meet the needs of our global business. Demonstrating strategic thinking and a commitment to best practices, you will leverage your deep understanding of financial consolidation, reporting systems, and analytical acumen to make informed decisions that drive our business forward. You are responsible for monitoring risk data processing from upstream sources and ensuring distribution to consumers (Finance, Risk, Capital, Treasury) is of proper quality via data attendance adherence, data quality (through attribute validations), period-over-period anomaly reviews, appropriate adjustments, and real-time data availability communication. Data Management Controllers work in conjunction with Asset Class Controllers (ACC), providing authoritative datasets to consumers. Feedback loops across teams and consumers allow for constant data refinement, increasing the quality and authoritativeness of base data.

Responsibilities:
- Monitor and analyze the data quality of the LRI/CRI/FRI data store, ensuring timeliness, completeness, and accuracy of data feeds in collaboration with operational and technology partners.
- Review the daily, monthly, and adjustment processes for data received into FRW/CFW/DAC/AWS, coordinating with technology partners to understand issues and discussing enhancements with downstream consumers and Operations teams.
- Analyze large volumes of data and perform comprehensive data analytics to identify trends, anomalies, and areas for improvement.
- Track and monitor metrics that accurately represent the timeliness, accuracy, and completeness of data flow from source platforms to the LRI/Data Acquisition and Control environment, facilitating prompt understanding, management, and resolution of data quality issues and process exceptions.
- Escalate unresolved issues to management or other lines of business when progress stalls or barriers arise, ensuring timely resolution.
- Focus on continuous improvement, proactively identifying and implementing enhancements and innovations that address data quality issues and eliminate manual workarounds.
- Communicate effectively, both in writing and verbally, when engaging with Risk reporting teams, Operations, Technology, and Operate teams to maintain strong client partnerships and ensure alignment on data quality objectives.

Qualifications:
- Postgraduate degree/MBA with 2 years of financial services industry experience
- Basic understanding of the firm's products
- Excellent organizational, problem-solving, negotiation, and analytical skills
- Ability to build and maintain partnerships within the various product-aligned businesses and across other corporate groups
- Ability to understand business drivers and requirements and influence others to deliver solutions
- Ability to critically challenge with the goal of identifying control issues
- Ability to quickly understand the workings of a complex processing system and general ledger systems across the infrastructure, including their interrelationships and dependencies
- Aptitude for learning and leveraging systems and data flows
- Ability to identify improvements to current processes and achieve efficiencies

Preferred qualifications, capabilities, and skills:
- Basic experience with a financial consolidation and reporting system
- Knowledge of industry standards and regulations
- Expertise in AI/ML tools (Alteryx, Python, Tableau, Databricks, etc.)
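As a concrete example of the period-over-period anomaly review this role performs, here is a minimal pandas sketch that flags feeds whose daily record counts swing beyond a tolerance; the feed names, figures, and 20% threshold are illustrative assumptions.

```python
# A minimal period-over-period anomaly check: flag feeds whose daily
# record count changes by more than a tolerance versus the prior day.
# Feed names, figures, and the 20% threshold are illustrative.
import pandas as pd

counts = pd.DataFrame({
    "feed": ["positions"] * 4,
    "date": pd.to_datetime(["2024-03-01", "2024-03-02",
                            "2024-03-03", "2024-03-04"]),
    "records": [10_120, 10_250, 4_980, 10_300],
})

counts = counts.sort_values(["feed", "date"])
counts["pct_change"] = counts.groupby("feed")["records"].pct_change()

TOLERANCE = 0.20  # assumed 20% day-over-day threshold
anomalies = counts[counts["pct_change"].abs() > TOLERANCE]
print(anomalies[["feed", "date", "records", "pct_change"]])
```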
ACTIVELY HIRING
posted 3 weeks ago
experience: 5 to 9 Yrs
location
Karnataka
skills
  • Data Quality
  • Data Governance
  • GCP
  • Credit Risk
  • Collections
  • Cloud Computing
  • Airflow
  • DBT
  • Alteryx
  • Agile
  • JIRA
  • Confluence
  • Financial Services
  • Retail Banking Operations
  • Data Management
  • Data Visualization
  • Regulatory Compliance
  • Data Transformation
  • Data Analytics
  • Data Profile
  • Data Lineage
  • Data Inventory
  • Data Resiliency
  • Data Quality Standards
  • Data Ingestion
  • Data Governance Forum
  • Fraud Use Cases
  • Data Pipelines
  • BigQuery
  • Bigtable
  • Dataflow
  • Dataproc
  • Data Quality Tools
  • Dataplex
  • Dashboard Design
Job Description
As an Assistant Director, Data Quality and Governance at HSBC, you will manage internal and external data used for automated decisions within the decisioning ecosystem for Fraud, Credit, and Collections. Your responsibilities will include:

- Analyzing and documenting data profiles, data lineage, data inventory, and data quality rules
- Defining and implementing data resiliency and data quality standards for multiple data sources and consumption use cases
- Managing data quality, resiliency, and governance for data ingested from multiple source systems into GCP
- Creating reports to monitor data quality trends and involving the right stakeholders to resolve identified issues
- Contributing to running a data governance forum to address critical data issues with the required stakeholders
- Managing clients from regions with different time zones, regulatory environments, and maturity levels in business, data, and systems

In terms of leadership and teamwork, you will be expected to:
- Assess relationships between recommended analytical solutions/projects and business performance
- Assist your line manager/supervisor with day-to-day operations and support peers when needed
- Act as a role model for Group Values and Behaviours (Open, Connected, Dependable)
- Support the achievement of team objectives and participate in the development of cohesive teams
- Contribute to creating a supportive work environment driven by people-centric values
- Manage the automation of repetitive programming steps to reduce the need for manual intervention

Qualifications for this role include:
- Experience analyzing and documenting requirements with data lineage, transformation rules, and data quality rules for credit risk, collections/recoveries, and fraud use cases
- Relevant cloud computing experience, including GCP experience
- Experience working on data/information management projects of varying complexity, with hands-on experience implementing data pipelines on Google Cloud (GCP)
- Experience with Agile/iterative processes in delivering use cases, and exposure to JIRA/Confluence for project management activity
- Knowledge and understanding of financial services/retail banking operations across the credit cycle
- Experience with data transformation, data visualization, analytics, and translating business requirement documents into detailed data and implementation requirements

At HSBC, you will have the opportunity to work in a culture that values and respects all employees, providing continuous professional development, flexible working arrangements, and opportunities to grow within an inclusive and diverse environment. Personal data relating to employment applications will be handled in accordance with the Bank's Privacy Statement.
ACTIVELY HIRING