38 Data Warehousing Jobs in Kochi

posted 1 week ago

GCP Technical Lead

Hucon Solutions India Pvt.Ltd.
Experience: 6 to 9 Yrs
Location: Kochi, Bangalore, Chennai, Hyderabad, Pune

skills
  • sql
  • security
  • python
  • gcp
  • devops
  • terraform
  • kubernetes
Job Description
Job Title: GCP Technical Lead
Employment Type: Permanent
Industry of the Employer: IT / Software Services
Department / Functional Area: Cloud Engineering, Data Engineering, DevOps

Hiring for a leading MNC.
Role: GCP Technical Lead
Skills: GCP, Python, SQL, BigQuery, Jenkins, Terraform, CI/CD, ETL/ELT
Experience: 6-9 Years
Locations: Chennai, Kochi, Bangalore, Hyderabad, Pune

Eligibility Criteria / Required Skills:
- Strong experience in Python, SQL, Data Warehousing concepts, and Data Modeling
- Expertise in GCP services: BigQuery, Cloud Run, Pub/Sub, Cloud Storage, Spanner, Cloud Composer, Dataflow, Cloud Functions
- Hands-on experience with Docker, Kubernetes, GitHub
- Strong understanding of Microservices and Serverless Architecture
- Ability to design scalable, secure, and cost-efficient cloud solutions
- Experience with Infrastructure as Code (IaC) using Terraform
- Knowledge of Cloud Security principles, IAM, and governance
- Experience with PySpark and Big Data tools
- Basic cloud networking knowledge
- Google Professional Cloud Architect / DevOps Engineer certification preferred
- Familiarity with the F&A domain is an added advantage
- Excellent communication and leadership skills

Role Responsibilities:
- Lead the design and architecture of end-to-end cloud solutions on GCP
- Oversee development of scalable ETL/ELT pipelines and cloud-native workflows (a minimal sketch follows this listing)
- Implement CI/CD pipelines using Jenkins and DevOps best practices
- Architect microservices- and serverless-based applications
- Drive cloud security, performance tuning, and cost optimization
- Build and maintain data pipelines using BigQuery, Dataflow, Cloud Storage, and Cloud Composer
- Guide teams through code reviews, best practices, and cloud standards
- Collaborate with cross-functional teams to ensure architectural alignment
- Ensure cloud compliance, governance, and secure architecture

Keywords / Skills: GCP, Python, SQL, Terraform, Jenkins, BigQuery, Cloud Composer, Pub/Sub, CI/CD, ETL, ELT, Microservices, Kubernetes, Docker, IAM, Cloud Security, Dataflow, Serverless, PySpark, Big Data

Total Experience: 6 to 9 Years
Salary Type: Yearly
Annual Salary Offered: As per company norms
Job Type: Full Time
Shift Type: Day Shift / Rotational (based on project requirement)
Location of the Job: Chennai | Kochi | Bangalore | Hyderabad | Pune

Why Join Us:
- Opportunity to work on cutting-edge cloud transformation projects
- Collaborative and high-growth environment
- Exposure to multi-cloud and hybrid cloud technologies
- Leadership opportunities in shaping cloud strategy and architecture

If you are passionate about building world-class cloud solutions and want to be part of an innovative team, we'd love to hear from you. Apply now!
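The posting leans heavily on BigQuery-centric ETL/ELT. As a purely illustrative sketch of that pattern (not this employer's actual pipeline), the snippet below runs an incremental merge from a staging table into a curated table with the google-cloud-bigquery client; the project, dataset, table, and column names are hypothetical.

```python
# Hypothetical names throughout: my-gcp-project, staging.orders, curated.orders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

# Fold newly staged rows into the curated table (incremental ELT).
merge_sql = """
MERGE `my-gcp-project.curated.orders` AS tgt
USING `my-gcp-project.staging.orders` AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN
  UPDATE SET tgt.status = src.status, tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN
  INSERT (order_id, status, updated_at)
  VALUES (src.order_id, src.status, src.updated_at)
"""

job = client.query(merge_sql)  # starts an asynchronous query job
job.result()                   # block until the merge completes
print(f"{job.num_dml_affected_rows} rows affected")
```

In a role like this, such a statement would typically be wrapped in a Cloud Composer DAG or a Jenkins-driven CI/CD job rather than run by hand.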

posted 1 week ago

GCP Data Engineer

Hucon Solutions India Pvt.Ltd.
Experience: 5 to 9 Yrs
Salary: 10 - 20 LPA
Location: Kochi, Bangalore, Chennai, Hyderabad

skills
  • devops
  • gcp
  • python
  • etl
  • docker
  • kubernetes
  • bigquery
  • terraform
  • dataflow
Job Description
Job Title: GCP Data Engineer
Employment Type: Permanent
Industry of the Employer: IT / Software Services
Department / Functional Area: Data Engineering, Cloud Engineering

Greetings from HUCON Solutions. Hiring for a leading MNC.
Role: GCP Data Engineer
Skills: GCP, Python, SQL, BigQuery, ETL Data Pipeline
Experience: 5-9 Years
Locations: Chennai, Kochi, Bangalore, Hyderabad, Pune

Eligibility Criteria:
- Strong hands-on experience in Python, Pandas, NumPy
- Expertise in SQL and a solid understanding of Data Warehousing concepts
- In-depth knowledge of GCP services including BigQuery, Cloud Run, Pub/Sub, Cloud Storage, Spanner, Cloud Composer, Dataflow, and Cloud Functions
- Experience with Docker, Kubernetes, GitHub
- Strong communication and analytical skills
- Working experience with PySpark, Data Modeling, Dataproc, Terraform
- Google Cloud Professional Data Engineer certification is highly preferred

Role Responsibilities:
- Design, develop, and manage robust, scalable ETL/ELT pipelines on GCP using Dataflow, Cloud Run, BigQuery, Pub/Sub, and Cloud Composer (a minimal sketch follows this listing)
- Implement data ingestion workflows for structured and unstructured data sources
- Build and maintain data warehouses and data lakes using BigQuery, Cloud Storage, and other GCP tools
- Work with data modeling, architecture, and ETL/ELT frameworks
- Ensure best practices in performance, optimization, and cloud data management

Keywords / Skills: GCP, Python, SQL, BigQuery, ETL, Data Pipeline, Data Engineering, Cloud Run, Pub/Sub, Cloud Storage, Dataproc, Terraform, Kubernetes, Docker, Dataflow

Total Experience: 5 to 9 Years
Salary Type: Yearly
Annual Salary Offered: As per company standards
Job Type: Full Time
Shift Type: Day Shift / Rotational (based on project requirement)
Location of the Job: Chennai | Kochi | Bangalore | Hyderabad | Pune
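For a sense of the Dataflow work described above, here is a minimal Apache Beam sketch of the Pub/Sub-to-BigQuery ingestion pattern; the subscription, table, and schema are hypothetical, and on GCP the pipeline would run under the DataflowRunner.

```python
# Hypothetical subscription, table, and schema; requires apache-beam[gcp].
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # DataflowRunner flags would be added here

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/orders-sub")
        | "ParseJson" >> beam.Map(json.loads)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders",
            schema="order_id:STRING,amount:FLOAT,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
    )
```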
posted 2 months ago
Experience: 5 to 9 Yrs
Location: Kochi, Kerala
skills
  • Snowflake
  • Azure
  • ETL
  • Python
  • Data Modeling
  • Power BI
  • Machine Learning
  • Data Governance
  • Data Security
  • Azure Data Factory
  • Azure Synapse Analytics
  • Azure Analysis Services
  • AI Integration
Job Description
Role Overview:
As a highly skilled Data Architect, you will be responsible for designing, developing, and maintaining end-to-end data architecture solutions using leading-edge platforms such as Snowflake, Azure, and Azure Data Factory (ADF). Your role will involve translating complex business requirements into scalable, secure, and high-performance data solutions that enable analytics, business intelligence (BI), and machine learning (ML) initiatives.

Key Responsibilities:
- Design and develop end-to-end data architectures for integration, storage, processing, and analytics utilizing Snowflake and Azure services.
- Build scalable, reliable, and high-performing data pipelines using Azure Data Factory (ADF) and Snowflake to handle large volumes of data.
- Create and maintain data models optimized for query performance and analytics with Azure Synapse Analytics and Azure Analysis Services (AAS).
- Define and implement data governance standards, data quality processes, and security protocols across all data solutions.

Cloud Data Platform Management:
- Architect and manage data solutions on Azure Cloud, ensuring seamless integration with services like Azure Blob Storage, Azure SQL, and Azure Synapse.
- Utilize Snowflake for data warehousing to ensure high availability, scalability, and performance.
- Design data lakes and data warehouses using Azure Synapse, creating architecture patterns for large-scale data storage and retrieval.

Data Integration & ETL Development:
- Lead the design and development of ETL/ELT pipelines using Azure Data Factory (ADF) to integrate data from various sources into Snowflake and other Azure-based data stores (a hedged sketch follows this listing).
- Develop data transformation workflows using Python and ADF to process raw data into analytics-ready formats.
- Design and implement efficient ETL strategies using Python, ADF, and Snowflake.

Analytics & Business Intelligence (BI):
- Design and implement data models for BI and reporting solutions using Azure Analysis Services (AAS) and Power BI.
- Create efficient data pipelines and aggregation strategies to support real-time and historical reporting across the organization.
- Implement best practices for data modeling to support business decision-making with tools like Power BI, AAS, and Synapse.

Advanced Data Solutions (AI/ML Integration):
- Collaborate with data scientists and engineers to integrate machine learning (ML) and AI models into the data pipeline architecture.
- Optimize the data architecture for AI-driven insights and large-scale, real-time analytics.

Collaboration & Stakeholder Engagement:
- Work with cross-functional teams to understand data requirements and align with business goals.
- Provide technical leadership, guiding development teams and ensuring adherence to architectural standards and best practices.
- Communicate complex data architecture concepts to non-technical stakeholders, translating business needs into actionable solutions.

Performance & Optimization:
- Continuously monitor and optimize data solutions for fast, scalable data queries, transformations, and reporting functions.
- Troubleshoot and resolve performance bottlenecks in data pipelines and architecture to ensure high availability.
- Implement strategies for data archiving, partitioning, and optimization in Snowflake and Azure Synapse environments.

Security & Compliance:
- Design and implement robust security frameworks to protect sensitive data across Snowflake, Azure Synapse, and other cloud platforms.
- Ensure data privacy and compliance with industry regulations (e.g., GDPR, CCPA) through necessary security controls and access policies.

Qualification Required:
- Proficiency in Snowflake, Azure Databricks, and Python.
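As a hedged illustration of the ADF-to-Snowflake hop this role designs, the sketch below bulk-loads ADF-landed Parquet files from an external Azure stage into Snowflake with COPY INTO; the account, credentials, stage, and table names are all hypothetical.

```python
# Hypothetical account, credentials, stage, and table names.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.west-europe.azure",
    user="ETL_USER",
    password="***",          # pull from a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
# Assumes an external stage ADF_LANDING was created over the Blob container
# that ADF writes to (CREATE STAGE ... URL='azure://...' ...).
cur.execute("""
    COPY INTO RAW.SALES
    FROM @ADF_LANDING/sales/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
cur.close()
conn.close()
```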

posted 2 months ago

Data Architect

Alliance Recruitment Agency
Experience: 8 to 12 Yrs
Location: Kochi, Kerala
skills
  • database design
  • data modeling
  • data warehousing
  • Python
  • SQL
  • AWS
  • Azure
  • big data technologies
  • cloud platforms
Job Description
You are responsible for designing and implementing scalable, enterprise-level data systems. Key responsibilities include:
- Designing and implementing scalable, enterprise-level data systems
- Collaborating with cross-functional teams to define data strategy and architecture
- Streamlining data integration and analytics processes
- Ensuring compliance with governance and security standards

Qualifications required:
- 8+ years of experience as a Data Architect or in a similar role
- Expertise in database design, data modeling, and data warehousing
- Strong Python and SQL skills; familiarity with big data technologies
- Hands-on experience with cloud platforms such as AWS or Azure
posted 2 months ago

Azure Data Engineer

SS Consulting Kochi
Experience: 4 to 8 Yrs
Location: Kochi, Kerala
skills
  • Spark
  • Python
  • SQL
  • Data Warehousing
  • Azure Data Engineering
  • Databricks
  • Data Lakes
  • CI/CD
Job Description
As an Azure Data Engineer, you will leverage your expertise in Azure Data Engineering to work with Databricks, Spark, and large volumes of data. Your primary focus will be data migration, ETL, and data integration processes. Proficiency in Python and SQL is crucial, along with a strong understanding of data warehousing and data lake concepts. Familiarity with CI/CD processes for data workflows is also required.

Key Responsibilities:
- Utilize Databricks, Spark, and other tools to efficiently handle large volumes of data
- Lead data migration, ETL, and data integration processes
- Demonstrate proficiency in Python and SQL for data manipulation and analysis
- Implement data warehousing and data lake concepts effectively
- Ensure data integrity, accuracy, and consistency through validation processes
- Test end-to-end data pipelines for performance, scalability, and fault tolerance

Qualifications Required:
- 4.5-6 years of experience in Azure Data Engineering
- Strong expertise in Databricks, Spark, and handling large volumes of data
- Proficiency in Python and SQL
- Knowledge of data warehousing and data lake concepts
- Familiarity with CI/CD processes for data workflows
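A minimal PySpark sketch of the migration/ETL work named above: read raw files from ADLS, clean them, and write a partitioned Delta table. The storage paths and column names are hypothetical; on Databricks the SparkSession is already provided as `spark`.

```python
# Hypothetical ADLS paths and columns; on Databricks, `spark` already exists.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("azure-etl").getOrCreate()

raw = (spark.read.option("header", True)
       .csv("abfss://raw@mylake.dfs.core.windows.net/orders/"))

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_date").isNotNull())
)

(cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("order_date")          # common layout for date-ranged queries
    .save("abfss://curated@mylake.dfs.core.windows.net/orders/"))
```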
posted 1 week ago
Experience: 2 to 6 Yrs
Location: Kochi, Kerala
skills
  • Data Analysis
  • Data Visualization
  • ETL
  • DAX
  • SQL
  • Database Management
  • Microsoft Power BI
Job Description
As a Microsoft Power BI Developer at our company, your role involves transforming data into actionable insights through interactive and visually appealing dashboards. Your responsibilities include:
- Gathering and analyzing business requirements to design and develop Power BI reports and dashboards.
- Extracting, transforming, and loading (ETL) data from various sources into Power BI.
- Creating data models and performing data analysis to identify trends and insights.
- Developing complex DAX queries and calculations to support reporting needs.
- Designing, developing, and deploying interactive dashboards and reports that provide actionable insights.
- Ensuring data accuracy and integrity in all reports and dashboards.
- Optimizing Power BI solutions for performance and usability.
- Working closely with business stakeholders to understand their reporting needs and provide effective solutions.
- Collaborating with data engineers and IT teams to ensure seamless data integration.
- Providing training and support to end-users on Power BI functionalities.
- Monitoring and maintaining existing Power BI solutions, making updates and improvements as needed.
- Staying up to date with the latest Power BI features and best practices.
- Implementing and maintaining data security and governance policies within Power BI.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Experience: Proven experience as a Power BI Developer or in a similar role; a strong background in data analysis, business intelligence, and data visualization; experience with ETL processes and data modeling.
- Technical Skills: Proficiency in Microsoft Power BI, including Power BI Desktop, Power BI Service, and Power BI Report Server; a strong understanding of DAX (Data Analysis Expressions) and Power Query (M language); experience with SQL and database management systems (e.g., SQL Server, Oracle); familiarity with data warehousing concepts and tools.

If you are passionate about leveraging data to drive insights and meet business objectives, please consider applying for this full-time Power BI Developer position located in Kochi.
posted 2 months ago
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • Data warehousing
  • Apache Spark
  • Hadoop
  • Docker
  • Kubernetes
  • Data models
  • Snowflake
  • Data modelling
  • ETL processes
  • Distributed computing principles
  • Containerization technologies
  • Orchestration tools
  • Data schemas
  • Redshift
Job Description
As a Data Engineer at the company, you will play a crucial role in designing, developing, and maintaining scalable and efficient big data processing pipelines for distributed computing systems. You will collaborate with cross-functional teams to understand data requirements and design appropriate data solutions. Your key responsibilities will include:
- Implementing data ingestion, processing, and transformation processes to support various analytical and machine learning use cases.
- Optimizing and tuning data pipelines for performance, scalability, and reliability.
- Monitoring and troubleshooting pipeline performance issues, identifying and resolving bottlenecks.
- Ensuring data quality and integrity throughout the pipeline, and implementing data validation and error handling mechanisms (see the sketch after this listing).
- Staying updated on emerging technologies and best practices in big data processing and analytics, and incorporating them into our data engineering practices.
- Documenting design decisions, technical specifications, and data workflows.

Qualifications required for this role include:
- Expertise in data modelling, data warehousing concepts, data governance best practices, and ETL processes.
- Understanding of distributed computing principles and experience with distributed data processing frameworks like Apache Spark or Hadoop.
- Familiarity with containerization technologies like Docker and orchestration tools like Kubernetes.
- Ability to develop data models, schemas, and structures to support business needs.
- Good understanding of a DW platform like Snowflake, Redshift, etc.
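The validation and error-handling responsibility above can be as simple as gating each batch on basic data-quality checks before publishing it. A hedged PySpark sketch, with hypothetical paths and column names:

```python
# Hypothetical lake path and key column.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://lake/curated/events/")

total = df.count()
null_keys = df.filter(F.col("event_id").isNull()).count()
duplicates = total - df.dropDuplicates(["event_id"]).count()

# Fail fast instead of shipping bad data downstream.
if null_keys or duplicates:
    raise ValueError(
        f"DQ check failed: {null_keys} null keys, "
        f"{duplicates} duplicates out of {total} rows")
```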
posted 2 months ago
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • GCP
  • data processing
  • Cloud Storage
  • data engineering
  • data cleansing
  • data enrichment
  • data validation
  • data modeling
  • data warehousing
  • Python
  • SQL
  • Scala
  • programming languages
  • PySpark
  • ETL workflows
  • BigQuery
  • Dataflow
  • data storage solutions
  • data transformations
  • ETL processes
Job Description
Role Overview:
You will play a crucial role in designing and implementing efficient data solutions on the Google Cloud Platform (GCP). Your strong data engineering skills, GCP expertise, and proficiency in data processing technologies, particularly PySpark, will be essential in this role.

Key Responsibilities:
- Design, implement, and optimize end-to-end data pipelines on GCP with a focus on scalability and performance.
- Develop and maintain ETL workflows for seamless data processing.
- Utilize GCP services such as BigQuery, Cloud Storage, and Dataflow for effective data engineering.
- Implement and manage data storage solutions on GCP.
- Leverage PySpark for advanced data transformations, ensuring high-quality and well-structured output.
- Implement data cleansing, enrichment, and validation processes using PySpark (sketched after this listing).

Qualifications Required:
- Proven experience as a Data Engineer, with a strong emphasis on GCP.
- Proficiency in GCP services such as BigQuery, Cloud Storage, and Dataflow.
- Expertise in PySpark for data processing and analytics.
- Experience with data modeling, ETL processes, and data warehousing.
- Proficiency in programming languages such as Python, SQL, or Scala for data processing.
- Relevant certifications in GCP or data engineering are a plus.
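To illustrate the cleansing/enrichment step called out above, here is a minimal PySpark sketch that normalizes raw JSON events and appends them to BigQuery; the bucket, project, and column names are hypothetical, and it assumes the spark-bigquery connector is available to the cluster (as it is on Dataproc).

```python
# Hypothetical bucket, project, and columns; assumes the spark-bigquery
# connector jar is on the cluster classpath (e.g. Dataproc).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcp-cleanse").getOrCreate()

events = spark.read.json("gs://my-raw-bucket/events/*.json")

enriched = (
    events.dropna(subset=["user_id"])                     # cleanse
          .withColumn("email", F.lower(F.trim("email")))  # normalize
          .withColumn("ingest_date", F.current_date())    # enrich
)

(enriched.write.format("bigquery")
    .option("temporaryGcsBucket", "my-tmp-bucket")
    .mode("append")
    .save("my-project.analytics.events_clean"))
```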
posted 2 months ago

Warehouse Manager

Olive street food cafe
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • Warehouse Management
  • Inventory Management
  • Order Fulfillment
  • Shipping
  • Team Leadership
  • Process Improvement
  • Logistics
  • Communication
  • Safety Procedures
  • Inventory Software Systems
  • Organizational Skills
  • Health and Safety Regulations
Job Description
As a Warehouse Manager, you will play a crucial role in overseeing and coordinating daily warehousing activities. Your responsibilities will include ensuring efficient warehouse operations, managing inventory and order fulfillment, and leading a team to maximize productivity and accuracy.

Key Responsibilities:
- Oversee daily warehouse operations including receiving, storage, inventory management, and shipping.
- Ensure accurate recording, organization, and maintenance of inventory.
- Implement and enforce safety procedures to comply with company standards.
- Maintain a clean, organized, and safe warehouse environment.
- Monitor and manage warehouse equipment and supplies.
- Analyze productivity data to identify and implement process improvements.
- Collaborate with other departments to ensure efficient workflow.
- Prepare regular reports on warehouse performance and inventory levels.

Qualifications:
- Proven experience as a Warehouse Manager or in a similar role.
- Strong knowledge of warehouse operations, logistics, and inventory software systems.
- Excellent leadership, communication, and organizational skills.
- Ability to lift heavy items and operate warehouse equipment.
- Understanding of health and safety regulations.
- High school diploma or GED required; a Bachelor's degree in a relevant field is a plus.

In addition to the above, this position offers benefits such as a flexible schedule, leave encashment, and paid time off. The work location for this role is in person.
posted 2 months ago

Pricing Manager

JFS LOGISTICS
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • analytical skills
  • data analysis
  • market research
  • communication skills
  • negotiation skills
  • Microsoft Excel
  • numerical skills
  • pricing strategy development
  • cross-functional teams collaboration
  • independent work management
Job Description
As a Pricing Manager at JFS Logistics, your role involves analyzing market trends, developing pricing strategies, ensuring profitability, and coordinating with various departments. Your responsibilities include conducting pricing research, preparing proposals, monitoring competitor pricing, and adjusting prices based on market conditions and business goals. You will collaborate with sales and finance teams to align pricing with overall business objectives.

Qualifications:
- Strong analytical and numerical skills, with proficiency in data analysis
- Experience in pricing strategy development and market research
- Excellent communication and negotiation skills
- Ability to work collaboratively with cross-functional teams
- Proficiency in Microsoft Excel and other relevant software tools
- Bachelor's degree in Business, Finance, Economics, or a related field
- Experience in the logistics industry is a plus
- Ability to work independently and manage multiple tasks

At JFS Logistics, we simplify global trade with seamless logistics solutions tailored to business needs. Specializing in international freight forwarding, customs clearance, warehousing, and supply chain management, we aim to connect businesses with the world through fast, secure, and cost-effective services. Trusted by clients in various industries, we have a strong network of partners across the GCC, Asia, Europe, and Africa.
posted 1 month ago

Warehouse Coordinator

Nidi Consultancy
Experience: 1 to 5 Yrs
Location: Kochi, Kerala
skills
  • Warehouse Management
  • Operations Management
  • Layout Design
  • Stock Control
  • Budgeting
  • Client Management
  • Supplier Management
  • Transport Management
  • Employee Management
  • Report Generation
  • Excel
  • Pivot Table
  • Policy Enforcement
Job Description
As a Warehouse Manager, you will be responsible for strategically managing the warehouse in compliance with the company's policies and vision. Your role will involve overseeing receiving, warehousing, distribution, and maintenance operations. It will be your responsibility to set up the layout and ensure efficient space utilization. You will need to initiate, coordinate, and enforce optimal operational policies and procedures to enhance efficiency.

Key Responsibilities:
- Adhere to all warehousing, handling, and shipping legislation requirements
- Maintain standards of health and safety, hygiene, and security within the warehouse
- Manage stock control and reconcile with the data storage system
- Prepare the annual budget for warehouse operations
- Liaise with clients, suppliers, and transport companies to ensure smooth operations
- Plan work rotas, assign tasks appropriately, and appraise the team's results
- Recruit, select, orient, coach, and motivate employees to maintain a high-performing team
- Produce reports and statistics regularly, such as IN/OUT status reports, dead stock reports, etc.

Qualifications Required:
- Must have advanced knowledge of Excel, pivot tables, etc.

Additional Details:
- Contact: +91 7994413136
- Job Type: Full-time
- Benefits: Health insurance
- Work Location: In person

If you have at least 1 year of total work experience and are willing to reliably commute or relocate to Ernakulam, Kerala, then this role could be the right fit for you.
posted 2 months ago

Technical Architect

Edstem Technologies
Experience: 5 to 15 Yrs
Location: Kochi, Kerala
skills
  • Java
  • Python
  • Spring Boot
  • Django
  • Flask
  • Angular
  • JavaScript
  • Kafka
  • Spark
  • AWS
  • Azure
  • GCP
  • Docker
  • Kubernetes
  • PostgreSQL
  • MySQL
  • MongoDB
  • Redis
  • Elasticsearch
  • System design
  • Distributed systems
  • Scalability
  • Performance
  • Security
  • Leadership
  • Communication
  • Collaboration
  • Data modeling
  • Monitoring tools
  • Node.js
  • FastAPI
  • React
  • TypeScript
  • ETL/ELT pipelines
  • CI/CD
  • Infrastructure-as-Code
  • Architecture patterns
  • Message queuing systems
  • Event-driven architectures
  • Agile/Scrum methodology
  • LLM stacks
  • OpenAI
  • Claude
  • LangChain
  • Vector databases
  • AI/ML Engineering
  • RAG architectures
  • GraphQL
  • API design patterns
  • Analytics platforms
  • Observability
Job Description
As a Technical Architect, you will play a pivotal role in defining, designing, and overseeing the technical architecture of projects while maintaining strong hands-on coding capabilities. You will work across the Frontend, Backend, DevOps, and Data Engineering domains, ensuring best practices, scalability, and efficiency in all solutions. Additionally, you will act as a mentor, guiding engineers at different levels while staying current with emerging technologies, including AI/ML and LLM stacks.

Key Responsibilities:
- Define technical architecture and lead design discussions for scalable, secure, and high-performance applications
- Provide hands-on coding and technical expertise across the full technology stack
- Design and implement solutions using the Java, Python, and Node.js ecosystems
- Architect event-driven systems and messaging architectures using Kafka and other messaging platforms (a small sketch follows this listing)
- Design and oversee data pipelines and data engineering solutions
- Collaborate with stakeholders, product managers, and development teams to translate business needs into technical solutions
- Drive adoption of cloud-native, microservices, and modern architecture practices
- Implement and oversee CI/CD pipelines, infrastructure automation, and monitoring systems
- Perform code reviews, design reviews, and technical audits to ensure quality and adherence to standards
- Explore and integrate AI/ML capabilities and LLM-based solutions into existing architectures
- Act as a mentor and technical coach for software engineers across levels
- Continuously evaluate and recommend new tools, frameworks, and practices
- Own technical risk management and ensure alignment with project timelines and goals

Required Skills & Experience:
- Around 15 years of IT industry experience, with at least 5+ years in an architect role
- Strong hands-on coding capability with proven expertise in:
  - Backend: Java with the Spring Boot framework, Python (Django/FastAPI/Flask), Node.js (Express, NestJS)
  - Frontend: React and Angular, with strong JavaScript/TypeScript fundamentals
  - Data Engineering: ETL/ELT pipelines, data warehousing, streaming architectures (Kafka, Spark)
  - DevOps & Cloud: AWS/Azure/GCP, Docker, Kubernetes, CI/CD, Infrastructure-as-Code
- Strong knowledge of system design, architecture patterns, and distributed systems
- Hands-on experience with databases (PostgreSQL, MySQL, MongoDB, Redis, Elasticsearch)
- Experience with message queuing systems and event-driven architectures
- Ability to design for scalability, performance, and security
- Strong leadership, communication, and collaboration skills
- Experience in mentoring and developing engineering teams
- Exposure to Agile/Scrum methodology

Highly Desired:
- Willingness and enthusiasm to learn emerging technologies, particularly LLM stacks, AI/ML engineering fundamentals, RAG architectures, and prompt engineering
- Experience with GraphQL and modern API design patterns
- Knowledge of data modeling and analytics platforms
- Familiarity with observability and monitoring tools

This role offers the opportunity to shape technical direction while working with cutting-edge technologies and building next-generation solutions.
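As a small illustration of the event-driven messaging this role architects, the sketch below publishes a domain event to Kafka with the kafka-python client; the broker address, topic, and event shape are hypothetical.

```python
# Hypothetical broker, topic, and event shape; uses the kafka-python client.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",  # wait for the full in-sync replica set, favouring durability
)

event = {"type": "OrderPlaced", "order_id": "A-1001", "amount": 249.0}
producer.send("orders.events", key=b"A-1001", value=event)  # keyed for per-key ordering
producer.flush()  # block until the broker acknowledges delivery
```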
posted 1 week ago
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • data warehousing
  • ETL
  • Snowflake
  • Java
  • Python
  • Distributed computing
  • Big Data Engineering
  • Data processing patterns
  • Real-Time/Stream analytics
Job Description
As a Data and Analytics (D&A) Senior specializing in Snowflake at EY, you will be a crucial part of the team helping clients solve complex business challenges through data and technology. Your role involves leading and architecting the migration of data analytics environments from Teradata to Snowflake, developing and deploying big data pipelines in a cloud environment, and optimizing model code for faster execution.

Key Responsibilities:
- Lead and architect the migration of the data analytics environment from Teradata to Snowflake with a focus on performance and reliability
- Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse
- Design, develop, and migrate existing on-prem ETL routines to a cloud service
- Interact with senior leaders to understand their business goals and contribute to the delivery of workstreams
- Design and optimize model code for faster execution

To excel in this role, you should have hands-on experience in data warehousing, ETL, and Snowflake development. You should also be proficient in Snowflake modeling and integrating with third-party tools, and have experience with advanced Snowflake concepts such as setting up resource monitors and performance tuning (a hedged sketch follows this listing). Additionally, you should have experience in object-oriented and functional programming styles using Java/Python for Big Data Engineering problems. Collaboration with cross-functional teams and continuous improvement of Snowflake products and marketing are also essential aspects of this role.

To qualify for this position, you should be a computer science graduate or equivalent with 3-7 years of industry experience. Working experience with an Agile-based delivery methodology is preferable. Strong communication skills, technical expertise in Snowflake, and the ability to deploy Snowflake following best practices are also necessary. Ideally, you should have client management skills and a minimum of 5 years of experience as an Architect on analytics solutions, including around 2 years with Snowflake.

EY values individuals with technical experience and enthusiasm to learn in a fast-paced environment. Working at EY offers the opportunity to work on inspiring and meaningful projects, with a focus on education, coaching, and personal development. You will have support, coaching, and feedback from engaging colleagues, along with opportunities to develop new skills and progress your career. EY is dedicated to building a better working world by creating long-term value for clients, people, and society through diverse teams across the globe. Join EY and be part of a team committed to building trust in the capital markets and helping clients grow, transform, and operate through data and technology-driven solutions.
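Since the posting calls out resource monitors and performance tuning, here is a hedged sketch of those administrative steps issued through the Snowflake Python connector; all connection details, object names, and quotas are hypothetical.

```python
# Hypothetical connection details, object names, and quota.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="***")
cur = conn.cursor()

# Cap monthly credit burn and suspend the warehouse at the quota.
cur.execute("""
    CREATE OR REPLACE RESOURCE MONITOR etl_monitor
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 90 PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND
""")
cur.execute("ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor")

# A common tuning lever: cluster a large table on its dominant filter column.
cur.execute("ALTER TABLE analytics.sales CLUSTER BY (sale_date)")
cur.close()
conn.close()
```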
posted 4 weeks ago

Supply Chain Manager

BHA FOODS PRIVATE LIMITED
Experience: 5 to 10 Yrs
Salary: 6 - 14 LPA
Location: Kochi, Ernakulam, Bangalore, Chennai, Hyderabad, Gurugram, Pondicherry, Pune, Chandigarh, Mumbai City

skills
  • warehouse operations
  • distribution
  • demand
  • supply chain management
  • logistics
  • sourcing
  • inventory management
  • management
  • supply
  • planning
  • procurement
  • materials management
Job Description
We are looking for an experienced and organized Supply Chain Manager to manage the complete supply chain process, from purchasing raw materials to delivering the final product. The ideal candidate will plan, coordinate, and monitor the movement of goods, ensuring operations run smoothly and efficiently. This role involves working with suppliers, production teams, and logistics partners to achieve business goals and customer satisfaction.

Key Responsibilities:
- Manage procurement, production planning, and logistics operations.
- Build strong relationships with suppliers and negotiate contracts.
- Monitor inventory levels and reduce waste or shortages.
- Coordinate with internal departments for smooth operations.
- Track shipments and ensure on-time delivery to customers.
- Analyze data to improve supply chain efficiency and reduce costs.
- Prepare reports and share regular updates with management.

Desired Candidate Profile:
- Bachelor's degree in Supply Chain Management, Business, or a related field.
- 5+ years of experience in supply chain or operations management.
- Good knowledge of logistics, procurement, and inventory management.
- Strong communication, leadership, and problem-solving skills.
- Familiarity with ERP systems (SAP, Oracle, or NetSuite preferred).

Key Skills: Supply Chain Management, Procurement, Vendor Management, Logistics, Inventory Control, Planning, Coordination, Forecasting, ERP Systems, Communication, Leadership.

About the Company: Our company is a growing organization offering exciting opportunities for professionals to build rewarding careers. We value teamwork, innovation, and operational excellence. Join us and be part of a dynamic team driving success through efficient supply chain management.
posted 2 weeks ago

Logistics Executive

SHARMA TRADERS ENTERPRISES
Experience: 20 to 25+ Yrs
Salary: 20 - 32 LPA
Work Type: Contractual
Location: Kochi, Bhubaneswar, Bangalore, Chennai, Hyderabad, Kolkata, Pune, Mumbai City, Delhi, Guwahati

skills
  • time management
  • aptitude tests
  • communication skills
  • organizational development
  • flexibility training
  • analytical skills
  • interpersonal skills
  • problem-solving
  • resilience
Job Description
A Logistics Executive manages the flow of goods from suppliers to customers, with duties including coordinating transportation, overseeing warehousing, ensuring timely and cost-effective delivery, and managing inventory. Key skills include strong analytical and problem-solving abilities, excellent communication and interpersonal skills, strategic planning, and technological and data-analysis aptitude.
posted 2 months ago

Snowflake Data Engineer

Viraaj HR Solutions Private Limited
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • data profiling
  • performance tuning
  • data modeling
  • sql
  • data security
  • snowflake
  • azure
  • gcp
  • etl tools
  • data engineering
  • data warehousing
  • data integration
  • sql proficiency
  • data visualization tools
  • automated data workflows
  • cloud services (AWS, Azure, GCP)
  • data documentation
Job Description
As a Data Engineer at Viraaj HR Solutions, your role will involve the following responsibilities:
- Designing and implementing data models in Snowflake for optimized storage and retrieval.
- Developing ETL processes to ensure robust data integration pipelines.
- Collaborating with data analysts and business stakeholders to gather requirements for data solutions.
- Performing data profiling to understand data quality and cleansing data as necessary.
- Optimizing query performance using Snowflake-specific features.
- Monitoring data warehouse performance and troubleshooting any data-related issues.
- Ensuring data security and compliance with company policies and regulations.
- Creating and maintaining documentation for data processes and workflows.
- Developing automated data workflows to reduce manual processing (sketched after this listing).
- Participating in architecture discussions and contributing to the design of scalable data solutions.
- Integrating Snowflake with other cloud services for enhanced data functionality.
- Performing regular data backups and disaster recovery exercises.
- Staying updated with the latest features and enhancements in Snowflake.
- Training junior engineers and advocating for best practices in data engineering.
- Contributing to the continuous improvement of data engineering processes.

Qualifications required for this role include:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 3 years of experience as a Data Engineer or in a similar role.
- Strong proficiency in Snowflake, with hands-on experience in designing data models.
- Expertise in SQL and experience querying large datasets.
- Experience with ETL tools and data integration practices.
- Familiarity with cloud services such as AWS, Azure, or GCP.
- Knowledge of data warehousing concepts and best practices.
- Proficiency in performance tuning and optimization techniques.
- Excellent problem-solving and analytical skills.
- Strong communication skills, both verbal and written.
- Ability to work collaboratively in a team environment.
- Familiarity with data visualization tools is a plus.
- Experience in Agile methodologies is advantageous.
- Certifications in Snowflake or data engineering are preferred.
- A proactive approach to learning and implementing new technologies.

Joining Viraaj HR Solutions as a Data Engineer will provide you with a unique opportunity to work on exciting data projects in a vibrant and innovative environment.
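One concrete reading of the "automated data workflows" item above is a scheduled Snowflake task. A minimal sketch via the Python connector follows, with hypothetical warehouse and table names:

```python
# Hypothetical connection details, warehouse, and table names.
import snowflake.connector

conn = snowflake.connector.connect(account="...", user="...", password="***")
cur = conn.cursor()

# Rebuild the reporting aggregate every night at 02:00 UTC.
cur.execute("""
    CREATE OR REPLACE TASK refresh_daily_sales
      WAREHOUSE = TRANSFORM_WH
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT OVERWRITE INTO REPORTING.DAILY_SALES
      SELECT sale_date, SUM(amount) AS amount
      FROM RAW.SALES
      GROUP BY sale_date
""")
cur.execute("ALTER TASK refresh_daily_sales RESUME")  # tasks are created suspended
cur.close()
conn.close()
```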
posted 2 months ago
Experience: 7 to 11 Yrs
Location: Kochi, Kerala
skills
  • Tableau
  • Data Analysis
  • Data Visualization
  • SQL
  • Data Warehousing
  • Communication
  • Collaboration
Job Description
Role Overview:
As a Senior Tableau Developer/Lead at UST, your primary responsibility will be to design, develop, and maintain Tableau dashboards and reports that provide actionable insights for business decision-making. You will collaborate with cross-functional teams to ensure the effective use of Tableau for data analysis.

Key Responsibilities:
- Develop interactive Tableau dashboards and reports that meet business requirements.
- Create visually appealing and user-friendly data visualizations to convey complex information effectively.
- Work closely with business analysts to understand data requirements and translate them into Tableau solutions.
- Integrate data from various sources into Tableau for comprehensive and accurate reporting.
- Optimize Tableau workbooks and dashboards for performance and responsiveness.
- Document data sources, data models, and Tableau processes for knowledge transfer and ongoing maintenance.
- Provide training and support to end-users on Tableau functionality and best practices.
- Collaborate with cross-functional teams to deliver effective Tableau solutions.

Qualifications Required:
- Minimum 7 years of experience as a Tableau Developer or in a similar role.
- Proven experience with a strong portfolio of Tableau dashboards and reports.
- Proficiency in data analysis and visualization techniques.
- Strong SQL skills for data extraction and manipulation.
- Knowledge of data warehousing concepts and best practices.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Tableau Desktop and Server certification is a plus.
posted 2 months ago

Power BI Developer

Careersteer services private limited
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • Power BI
  • DAX
  • SQL
  • Data Warehousing
  • Power Query
Job Description
Role Overview:
As a Power BI Expert, your main responsibility is to design, develop, and maintain end-to-end BI solutions using Power BI. You must have a strong understanding of data structures and relationships, particularly using Power Query and DAX. Your expertise in data modeling and experience transforming raw data into meaningful insights using Power Query and DAX formulas will be crucial for this role.

Key Responsibilities:
- Design, develop, and maintain end-to-end BI solutions using Power BI
- Utilize Power Query and DAX for data modeling and data transformation
- Work with SQL and data warehousing
- Configure gateway connections for data sources behind private endpoints using the Power BI Service

Qualifications Required:
- 3 to 5 years of overall experience in Power BI
- Proficiency in Power Query and DAX
- Immediate to 30 days notice period

Note: The budget for this position is 12 LPA and it is a full-time job.
posted 1 month ago

Power BI & SQL Expert (Trainer)

Beat Center of Excellence
Experience: 3 to 7 Yrs
Location: Kochi, Kerala
skills
  • Power BI
  • SQL
  • DAX
  • Data Analysis
  • Data Visualization
  • ETL
  • Data Modeling
  • SQL Server
  • Excel
  • Azure
  • APIs
  • Analytical Skills
  • Communication Skills
  • Power Query
  • Problem-solving
  • Documentation Skills
Job Description
You are a skilled Power BI and SQL expert who will be joining the data team. Your role will involve designing, developing, and maintaining interactive dashboards and reports using Power BI, and writing, optimizing, and troubleshooting complex SQL queries to extract, manipulate, and analyze data. Collaboration with business stakeholders to gather requirements and transform them into technical specifications will be a crucial part of your responsibilities. You will also perform data validation to ensure accuracy across reports, build and maintain data models and ETL pipelines, monitor data refresh schedules, and provide insights and recommendations through data visualizations and analysis. Documenting report logic, KPIs, and data model structures also falls under your purview.

Key Responsibilities:
- Design, develop, and maintain interactive dashboards and reports using Power BI
- Write, optimize, and troubleshoot complex SQL queries
- Collaborate with business stakeholders to gather requirements
- Perform data validation and ensure data accuracy
- Build and maintain data models and ETL pipelines
- Monitor data refresh schedules and resolve data load issues
- Provide insights and recommendations through data visualizations
- Document report logic, KPIs, and data model structures

Qualifications Required:
- Bachelor's degree in Computer Science, Information Systems, or a related field
- 3+ years of hands-on experience with Power BI and SQL
- Proficiency in DAX and Power Query (M language)
- Strong understanding of relational databases and data warehousing concepts
- Experience with data sources like SQL Server, Excel, Azure, or APIs
- Ability to translate business requirements into impactful visuals
- Strong problem-solving and analytical skills
- Excellent communication and documentation skills

The company also prefers candidates with experience with cloud platforms like Azure, AWS, or Google Cloud; knowledge of Python or R for data analysis (optional); and experience with other BI tools like Tableau or Qlik.

Please note that the available job types are Full-time, Part-time, Permanent, Contractual/Temporary, and Freelance, and the work schedule may include day, evening, fixed, morning, or night shifts, Monday to Friday. The work location is in person, and proficiency in English is required.
posted 2 months ago
Experience: 2 to 6 Yrs
Location: Kochi, Kerala
skills
  • optimization
  • Transformers
  • communication skills
  • Python programming
  • ML tooling
  • PyTorch
  • TensorFlow
  • classical ML & DL
  • supervised/unsupervised learning
  • CNNs
  • RNNs/LSTMs
  • fine-tuning open models
  • PEFT approaches
  • model quantization
  • cloud ML environment
  • multimodal training pipelines
  • MLOps concepts
  • distributed/efficiency libraries
  • LangGraph / Autogen / CrewAI
  • BigQuery / Synapse
  • AI safety
  • responsible-AI best practices
Job Description
As an AI engineer at our company in Kochi, you will be responsible for designing, prototyping, and delivering generative AI capabilities. Your role will involve practical research, building POCs, fine-tuning open models, contributing to multimodal experiments, and helping take solutions toward production while rapidly learning both modern and classical ML techniques.

Core Responsibilities:
- Build and evaluate prototypes/POCs for generative AI features and ideas.
- Fine-tune and adapt open-source LLMs and smaller generative models for targeted use cases.
- Collaborate on multimodal experiments (text, image, audio) and implement training/evaluation pipelines.
- Implement data preprocessing, augmentation, and basic feature engineering for model inputs.
- Run experiments: design evaluation metrics, perform ablations, log results, and iterate on models.
- Optimize inference and memory footprint through quantization, batching, and basic distillation.
- Contribute to model training pipelines, scripting, and reproducible experiments.
- Work with cross-functional teams to prepare prototypes for deployment.
- Write clear documentation, present technical results to the team, participate in code reviews, and share knowledge.
- Learn continuously by reading papers, trying new tools, and bringing fresh ideas into projects.

Mandatory Technical Skills:
- Strong Python programming skills and familiarity with ML tooling such as numpy, pandas, and scikit-learn.
- Hands-on experience (2+ years) with PyTorch and/or TensorFlow for model development and fine-tuning.
- Solid grounding in classical ML & DL, including supervised/unsupervised learning, optimization, CNNs, RNNs/LSTMs, and Transformers.
- Good understanding of algorithms and data structures, numerical stability, and computational complexity.
- Practical experience fine-tuning open models such as Hugging Face Transformers, the LLaMA family, BLOOM, Mistral, or similar.
- Familiarity with PEFT approaches and simple efficiency techniques (see the sketch after this listing).
- Exposure to at least one cloud ML environment such as GCP Vertex AI, AWS SageMaker, or Azure AI.
- Good communication skills for documenting experiments and collaborating with product/infra teams.

Desirable / Preferred Skills:
- Experience with multimodal training pipelines or cross-modal loss functions.
- Familiarity with MLOps concepts and tools like DeepSpeed, Accelerate, Ray, or similar.
- Knowledge of LangGraph, Autogen, or CrewAI, or an interest in agentic systems.
- Experience with BigQuery, Synapse, or data warehousing for analytics.
- Awareness of AI safety and responsible-AI best practices.

In this role, you will gain rapid exposure to state-of-the-art generative AI and real production use cases, combining a research mindset with product delivery. You will also receive mentoring from senior researchers and opportunities to publish or demo prototypes.
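The PEFT sketch referenced above: wrapping an open causal LM with LoRA adapters so that only a small fraction of its weights trains. The model choice and hyperparameters are illustrative, not prescriptive.

```python
# Hypothetical model choice and hyperparameters; requires transformers + peft.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "mistralai/Mistral-7B-v0.1"  # any open causal LM works here
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora = LoraConfig(
    r=8, lora_alpha=16, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # adapt attention projections only
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # typically well under 1% of weights
```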