
16 SQL Clustering Jobs in Hyderabad

posted 2 months ago
experience 7 to 11 Yrs
location
Hyderabad
skills
  • Data analysis
Job Description
About the Role:
We are seeking a highly skilled Senior Data Engineer with deep expertise in Snowflake and strong proficiency in SQL to join our Data Insights Engineering team. The ideal candidate has hands-on experience designing and building scalable data pipelines, optimizing data warehouses, and enabling advanced analytics. This role requires strong data engineering fundamentals, experience in Python-based ETL/ELT development, and working knowledge of BI tools like Tableau to support business stakeholders.

Key Responsibilities:
- Design, develop, and maintain data pipelines and workflows on Snowflake.
- Write highly optimized SQL queries, transformations, and stored procedures for complex data engineering use cases.
- Architect and implement data models, schemas, and warehouse structures that support analytics and reporting.
- Integrate data from multiple sources (databases, APIs, files, streaming) into Snowflake using Python/ETL frameworks.
- Optimize Snowflake performance: warehouses, caching, clustering, partitioning, and cost control (see the sketch below).
- Partner with analysts and business teams to deliver clean, reliable datasets for Tableau and other reporting tools.
- Implement data quality checks, monitoring, and alerting for critical pipelines.
- Collaborate with cross-functional teams (Data Scientists, Analysts, Product, Engineering).
- Contribute to best practices in data governance, documentation, and security.

Required Skills & Experience:
- 7+ years of hands-on Data Engineering experience.
- Strong expertise in Snowflake: warehouses, roles, RBAC, tasks, streams, procedures; Snowpark a plus.
- Expert-level SQL: query optimization, advanced analytics functions, performance tuning.
- Proficiency in Python for ETL/ELT scripting and automation.
- Experience with Tableau for dashboarding and data validation.
- Solid knowledge of data modeling, data warehousing concepts, and ELT pipelines.
- Strong understanding of cloud data platforms (AWS preferred).
- Familiarity with CI/CD, Git, Docker, and orchestration tools (Airflow, dbt, etc.).
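For illustration only, here is a minimal sketch of how the clustering and cost-control items above translate into Snowflake statements. It assumes the snowflake-connector-python package; the account, table, and warehouse names are hypothetical placeholders, not details from the listing.

```python
# Minimal sketch: applying a clustering key and a cost control in Snowflake.
# Assumes snowflake-connector-python; all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical credentials
    user="etl_user",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Cluster a large fact table on the columns most queries filter by,
# so Snowflake can prune micro-partitions during scans.
cur.execute("ALTER TABLE fact_orders CLUSTER BY (order_date, region)")

# Inspect how well-clustered the table currently is (returns a JSON report).
cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('fact_orders')")
print(cur.fetchone()[0])

# Cost control: suspend the warehouse after 60 idle seconds.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET AUTO_SUSPEND = 60")

cur.close()
conn.close()
```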

posted 2 months ago
experience 8 to 12 Yrs
location
Hyderabad, Telangana
skills
  • Oracle SOA
  • Oracle BPEL
  • OSB
  • integration architecture
  • SQL
  • XML
  • XSD
  • Webservices
  • XPATH
  • XQUERY
  • Clustering
  • performance tuning
  • Fusion Middleware applications
  • Oracle Integration Cloud OIC
  • JavaJ2EE
  • Oracle databases
  • WSDLs
  • SOAPREST web services
  • SOA Suite technical Adapters
  • JMS QueuesTopics
  • software development lifecycle
  • SOA design patterns
  • SOA Architecture
  • SOA administration
  • File Adapters
  • DB Adapters
  • JMS Adapters
  • SOA Database administration
  • purging
Job Description
As a candidate for the position, you will be responsible for designing and implementing solutions for the Utilities, Healthcare, and Hospitality industries. Your role will involve the following key responsibilities:
- Possessing at least 8 years of end-to-end implementation experience in Fusion Middleware applications, including Oracle SOA, Oracle BPEL, and OSB
- Having a minimum of 2 years of experience in Oracle Integration Cloud (OIC)
- Demonstrating expertise in integration architecture, design & development strategies, and providing solutions to customers
- Serving as a Subject Matter Expert in web services, XML, Java/J2EE, Oracle databases, SQL, WSDLs, XSD, and other integration standards
- Possessing a strong background in XML, XSD, XSL, WSDL, XPATH, XQUERY, etc. (a small XML/XPath sketch follows below)
- Developing SOAP/REST web services using SOA Suite technical Adapters
- Creating services for producing/consuming messages to/from JMS Queues/Topics
- Developing services for reading or writing files
- Having hands-on experience with the entire software development lifecycle, including requirements, design, implementation, integration, and testing
- Demonstrating in-depth knowledge and experience in SOA design patterns and SOA Architecture
- Possessing good knowledge in SOA administration, Clustering, and performance tuning
- Hands-on experience in processing high volumes of transactions using File, DB, and JMS Adapters
- Good knowledge in SOA Database administration and purging

Qualifications required for this role include:
- Minimum of 8 years of experience in end-to-end implementation of Fusion Middleware applications
- At least 2 years of experience in Oracle Integration Cloud (OIC)
- Strong skills in integration architecture, design & development strategies
- Expertise in web services, XML, Java/J2EE, Oracle databases, SQL, WSDLs, XSD, and other integration standards
- Proficiency in XML, XSD, XSL, WSDL, XPATH, XQUERY, etc.
- Hands-on experience with SOA design patterns and SOA Architecture
- Knowledge in SOA administration, Clustering, and performance tuning
- Experience in processing high volumes of transactions using File, DB, and JMS Adapters
- Understanding of SOA Database administration and purging
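For illustration only: in practice the XPath/XQuery work above runs inside the SOA/OSB engine, but a minimal Python sketch (standard library only) shows the same idea of addressing elements in a namespaced SOAP payload. The payload, namespace URIs, and field names are hypothetical.

```python
# Minimal sketch: extracting fields from a SOAP-style XML payload with
# XPath-like queries, using only the Python standard library.
import xml.etree.ElementTree as ET

payload = """<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
                            xmlns:ord="http://example.com/orders">
  <soap:Body>
    <ord:Order>
      <ord:Id>1042</ord:Id>
      <ord:Status>SHIPPED</ord:Status>
    </ord:Order>
  </soap:Body>
</soap:Envelope>"""

ns = {
    "soap": "http://schemas.xmlsoap.org/soap/envelope/",
    "ord": "http://example.com/orders",
}

root = ET.fromstring(payload)
# findall accepts a limited XPath subset with a namespace map; full
# XPath/XQuery transformations would typically run inside OSB/BPEL itself.
for order in root.findall(".//ord:Order", ns):
    print(order.find("ord:Id", ns).text, order.find("ord:Status", ns).text)
```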
posted 3 weeks ago
experience 9 to 13 Yrs
location
Hyderabad, Telangana
skills
  • Python
  • R
  • SQL
  • Power BI
  • Tableau
  • AWS
  • Azure
  • GCP
Job Description
Role Overview:
As a Manager / Sr. Manager CPG Analytics, you will lead end-to-end solutioning and delivery of analytics programs for global consumer goods clients. This role requires a unique blend of domain expertise, quantitative acumen, project leadership, and stakeholder engagement. You will serve as a trusted advisor to clients, translating commercial challenges into scalable analytical solutions while managing delivery excellence across multifunctional teams.

Key Responsibilities:
- Act as a strategic analytics advisor across Sales, Marketing, Trade, and Commercial functions within the CPG space.
- Translate client business goals into structured analytics programs across trade promotion optimization, pricing strategy, and revenue growth initiatives.
- Provide industry best practices and benchmarks to enhance client decision-making and maturity.
- Lead analytics engagements focused on:
  - Trade Promotion Effectiveness (TPE) and ROI analytics
  - Price-pack architecture and elasticity modelling (a price-elasticity sketch follows below)
  - Revenue Growth Management (RGM) levers
  - Route-to-market and channel performance analytics
  - Market Mix Modelling
- Apply advanced statistical and machine learning techniques using tools like Python/R, SQL, and Power BI/Tableau.
- Guide solution frameworks, models, and visualizations to ensure actionable outcomes for business users.
- Drive end-to-end program execution: scoping, planning, stakeholder alignment, delivery oversight, and impact measurement.
- Manage cross-functional teams across data science, engineering, visualization, and domain consulting.
- Ensure consistent delivery quality, governance, and client communication across regions and markets.
- Identify whitespace opportunities and new use cases within existing accounts to grow the CPG analytics footprint.
- Partner with Pre-Sales, Solutions, and BD teams in proposal development and PoC planning.
- Contribute to the development of accelerators, frameworks, and reusable assets for analytics delivery.

Qualifications & Experience:
Must-Have
- 9-12 years of experience in data & analytics roles with a strong focus on the Consumer Packaged Goods (CPG) industry.
- Deep functional expertise in trade, pricing, promotions, and revenue growth strategies.
- Proven experience leading analytics engagements with demonstrable business impact.
- Proficiency with tools like Python/R, SQL, Power BI/Tableau, and exposure to cloud-based data platforms (AWS, Azure, GCP).
- Strong project management, stakeholder engagement, and executive presentation skills.
Preferred
- Experience with syndicated CPG POS data (Nielsen, IRI) and marketing data.
- Familiarity with advanced modeling techniques: regression, time series, clustering, optimization.
- Exposure to global markets and multicultural client engagements.

Company Details:
The company, Straive, offers you the opportunity to:
- Drive high-impact analytics transformations for leading global CPG brands.
- Be part of a fast-growing, innovation-led analytics consulting practice.
- Work alongside top-tier talent across AI, data science, and business domains.
- Thrive in a flexible, entrepreneurial, and high-ownership work environment.
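For illustration only, a minimal sketch of the elasticity-modelling technique named above: estimating price elasticity of demand as the slope of a log-log regression. The data here is synthetic, not from any client engagement.

```python
# Minimal sketch of price-elasticity estimation via a log-log regression.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
price = rng.uniform(2.0, 6.0, size=200)                  # synthetic shelf prices
# Synthetic demand with a true elasticity of about -1.8, plus noise.
units = np.exp(8.0 - 1.8 * np.log(price) + rng.normal(0, 0.15, 200))

X = np.log(price).reshape(-1, 1)
y = np.log(units)
model = LinearRegression().fit(X, y)

# The slope of log(units) on log(price) is the price elasticity of demand.
print(f"estimated elasticity: {model.coef_[0]:.2f}")     # ~ -1.8
```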
posted 2 months ago
experience 8 to 12 Yrs
location
Hyderabad, Telangana
skills
  • Snowflake
  • Data Engineering
  • ETL
  • SQL
  • Data Modeling
  • Apache Spark
  • Airflow
  • dbt
  • Git
  • PySpark
  • AWS Glue
  • CICD
  • Data Lakes
Job Description
Role Overview:
You will be joining as a Principal Snowflake Data Engineer & Data Engineering Lead at Health Catalyst, a fast-growing company dedicated to solving national-level healthcare problems. In this role, you will play a crucial part in leading and mentoring cross-functional teams focused on developing innovative tools to support the mission of improving healthcare performance, cost, and quality. Your responsibilities will include owning the architectural vision and implementation strategy for Snowflake-based data platforms, leading the design, optimization, and maintenance of ELT pipelines, and driving best practices in schema design and data modeling.

Key Responsibilities:
- Own the architectural vision and implementation strategy for Snowflake-based data platforms.
- Lead the design, optimization, and maintenance of ELT pipelines and data lake integrations with Snowflake.
- Drive Snowflake performance tuning, warehouse sizing, clustering design, and cost governance.
- Leverage Snowflake-native features like Streams, Tasks, Time Travel, Snowpipe, and Materialized Views for real-time and batch workloads (see the sketch below).
- Establish robust data governance, security policies, and regulatory compliance within Snowflake.
- Ensure best practices in schema design, data modeling, and version-controlled pipeline development.

Qualification Required:
- 8-10 years of data engineering experience with at least 4-5 years of hands-on Snowflake expertise.
- Proven leadership of cross-functional data teams.
- Deep expertise in Snowflake internals, data governance, SQL, and data modeling.
- Hands-on experience with Apache Spark, PySpark, AWS Glue, orchestration frameworks, and CI/CD workflows.

Additional Details:
Health Catalyst is a dynamic and influential company in the healthcare industry, offering opportunities to work on solving national-level healthcare problems. The company values smart, hardworking, and humble individuals, providing a fast-paced and innovative work environment focused on improving the lives of millions of people.
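For illustration only, a minimal sketch of two of the Snowflake-native features named above (Streams and Tasks) wired into an incremental, CDC-style load. Object names are hypothetical; the SQL is issued via snowflake-connector-python.

```python
# Minimal sketch: a Snowflake Stream + Task pair for incremental loading.
# Assumes snowflake-connector-python; all object names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="de_user",
                                   password="***", warehouse="LOAD_WH",
                                   database="EDW", schema="STAGING")
cur = conn.cursor()

# A stream records inserts/updates/deletes on the source table.
cur.execute("CREATE OR REPLACE STREAM raw_events_stream ON TABLE raw_events")

# A task drains the stream on a schedule, but only when there is new data.
cur.execute("""
CREATE OR REPLACE TASK load_events_task
  WAREHOUSE = LOAD_WH
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_events_stream')
AS
  INSERT INTO curated_events
  SELECT event_id, payload, loaded_at FROM raw_events_stream
""")
cur.execute("ALTER TASK load_events_task RESUME")  # tasks start suspended
```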
posted 1 month ago
experience 5 to 9 Yrs
location
Hyderabad, Telangana
skills
  • SQL
  • Python
  • Tableau
  • Power BI
  • Microsoft Excel
  • PowerPoint
  • Looker
  • Jupyter Notebook
  • GCP Google Cloud Platform
Job Description
As a senior business intelligence analyst with over 5 years of experience, you will be responsible for leading and managing a team of data analysts in designing, analyzing, measuring, and deploying multiple campaigns. Your role will involve developing and enhancing reports to summarize key business metrics, providing strategic insights, and automating processes to improve efficiency. Additionally, you will conduct SQL training sessions, mentor team members, create frameworks and statistical models, and develop dashboards and data visualizations to track and analyze key performance indicators (KPIs). Collaboration with stakeholders to understand business needs, managing vendor relationships, and developing tools to streamline vendor management processes are also key aspects of this role.

Key Responsibilities:
- Lead and manage a team of data analysts for campaign design, analysis, measurement, and deployment.
- Develop and enhance reports summarizing key business metrics and providing strategic insights.
- Automate processes to improve efficiency and reduce manual intervention.
- Conduct SQL training sessions and mentor team members to enhance technical and analytical skills.
- Create frameworks and statistical models to support business decisions and campaign rollouts.
- Develop and maintain dashboards and data visualizations for tracking and analyzing KPIs.
- Collaborate with stakeholders to understand business needs and deliver actionable insights.
- Manage vendor relationships and develop tools for streamlining vendor management processes.

Qualifications:
- Experience: 5 to 8 years.

Required Skills:
- Proficiency in SQL, Python, Tableau, Power BI, Looker, Jupyter Notebook, GCP (Google Cloud Platform), Microsoft Excel, and PowerPoint.
- Strong understanding of statistical concepts and techniques like hypothesis testing, A/B testing, regression analysis, and clustering (a small A/B-test sketch follows below).
- Experience with data visualization and reporting tools.
- Excellent project management and business communication skills.
- Ability to lead and mentor a team, fostering a collaborative and growth-oriented environment.
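For illustration only, a minimal sketch of the A/B-testing workflow mentioned above: comparing conversion rates of two campaign variants with a two-proportion z-test. The counts are invented for the example.

```python
# Minimal sketch: two-proportion z-test for an A/B campaign comparison.
from math import sqrt
from scipy.stats import norm

control_conv, control_n = 420, 10_000   # hypothetical campaign A results
test_conv, test_n = 495, 10_000         # hypothetical campaign B results

p1, p2 = control_conv / control_n, test_conv / test_n
p_pool = (control_conv + test_conv) / (control_n + test_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_n + 1 / test_n))
z = (p2 - p1) / se
p_value = 2 * norm.sf(abs(z))           # two-sided test

print(f"lift: {p2 - p1:+.2%}, z = {z:.2f}, p = {p_value:.4f}")
```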
posted 1 week ago
experience 7 to 12 Yrs
location
Hyderabad, Telangana
skills
  • Python
  • NumPy
  • AWS
  • Azure
  • GCP
  • SQL
  • Git
  • Pandas
  • Scikitlearn
  • TensorFlow
  • PyTorch
  • SageMaker
  • Azure ML
  • Vertex AI
  • CICD
Job Description
Role Overview:
CGI is seeking a highly experienced and innovative Senior AI Engineer to join its rapidly expanding AI/ML team. With 7+ years of hands-on experience, you will play a crucial role in the end-to-end lifecycle of AI/ML solutions, from research and prototyping to developing, deploying, and maintaining production-grade intelligent systems. By collaborating closely with data scientists, product managers, and software engineers, you will drive the adoption of cutting-edge AI technologies across platforms.

Key Responsibilities:
- Design, develop, train, and optimize robust and scalable machine learning models (e.g., deep learning, classical ML algorithms) for various applications.
- Build and maintain end-to-end MLOps pipelines for model deployment, monitoring, versioning, and retraining, ensuring reliability and performance in production environments.
- Work with large, complex datasets; perform data cleaning, feature engineering, and data pipeline development to prepare data for model training and inference.
- Explore and evaluate new AI/ML technologies, algorithms, and research papers to identify opportunities for innovation and competitive advantage; rapidly prototype and test new ideas.
- Optimize AI/ML models and inference systems for speed, efficiency, and resource utilization.
- Partner closely with Data Scientists and Software Engineers to transition research prototypes into production-ready solutions, integrate AI models into existing products and platforms, and communicate complex AI concepts to technical and non-technical stakeholders.
- Write clean, maintainable, and well-documented code; implement software engineering best practices within the AI/ML lifecycle; and implement robust monitoring for model performance, data drift, and system health in production (a drift-check sketch follows below).

Qualifications Required:
Must-Have Skills:
- 7+ years of hands-on experience as an AI Engineer, Machine Learning Engineer, or a similar role focused on building and deploying AI/ML solutions.
- Strong proficiency in Python and its relevant ML/data science libraries (e.g., NumPy, Pandas, Scikit-learn, TensorFlow, PyTorch).
- Extensive experience with at least one major deep learning framework such as TensorFlow, PyTorch, or Keras.
- Solid understanding of machine learning principles, algorithms (e.g., regression, classification, clustering, ensemble methods), and statistical modeling.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their AI/ML services (e.g., SageMaker, Azure ML, Vertex AI).
- Proven experience with MLOps concepts and tools for model lifecycle management (e.g., MLflow, Kubeflow, SageMaker MLOps, Azure ML Pipelines).
- Strong SQL skills for data manipulation, analysis, and feature extraction from relational databases.
- Experience with data preprocessing, feature engineering, and handling large datasets.

Good-to-Have Skills:
- Familiarity with software development best practices including version control (Git), code reviews, and CI/CD.
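For illustration only, a minimal sketch of one production-monitoring check named above: detecting data drift in a numeric feature with a two-sample Kolmogorov-Smirnov test. The "training" and "live" samples are synthetic stand-ins.

```python
# Minimal sketch: data-drift check via a two-sample Kolmogorov-Smirnov test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)   # training distribution
live_feature = rng.normal(loc=0.3, scale=1.0, size=1_000)    # drifted live traffic

stat, p_value = ks_2samp(train_feature, live_feature)
if p_value < 0.01:
    print(f"drift detected (KS={stat:.3f}, p={p_value:.2e}) - consider retraining")
else:
    print("no significant drift")
```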
posted 2 months ago

Snowflake SME

SuperIntelligence Technologies
experience 10 to 14 Yrs
location
Hyderabad, Telangana
skills
  • SQL
  • Data security
  • Data modeling
  • Snowflake data warehouse platform
Job Description
As a Snowflake Subject Matter Expert (SME), you will play a vital role in architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform. Your expertise will be essential in enabling the Data team to construct a modern, scalable data platform. You will collaborate closely with data architects, engineers, and business teams to ensure that data ingestion, storage, and access solutions on Snowflake meet performance, security, and compliance requirements.

Your Key Responsibilities:
- Design and implement Snowflake data warehouse and lakehouse architectures.
- Develop and optimize Snowflake SQL queries, stored procedures, and data transformation logic.
- Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls.
- Integrate Snowflake with other data platform components, including Databricks, Data Factory, and cloud storage.
- Implement best practices for data partitioning, clustering, and caching to optimize performance and cost-efficiency.
- Participate in data ingestion automation, metadata management, and monitoring within Snowflake environments.
- Collaborate with security teams to enforce data governance, encryption, and compliance policies in Snowflake.
- Support CI/CD and automation of Snowflake deployment and pipeline orchestration.
- Provide knowledge transfer and technical guidance to team members and clients.

Skills And Attributes For Success:
- Deep expertise in the Snowflake data warehouse platform, including performance tuning and cost optimization.
- Strong SQL skills and experience with Snowflake-specific features such as Time Travel, Zero Copy Cloning, Streams, and Tasks (see the sketch below).
- Understanding of data security, role-based access control (RBAC), and compliance in Snowflake.
- Knowledge of data modeling principles for enterprise data warehouses.

Qualifications Required:
- 10+ years of experience in a related field.
- Proven experience in architecting and optimizing data warehouse solutions using Snowflake.
- Strong expertise in SQL and Snowflake-specific features.
- Familiarity with data security, access controls, and compliance practices within Snowflake.
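For illustration only, a minimal sketch of two Snowflake features named above, Time Travel and Zero Copy Cloning, as they might be used to recover from a bad load. Table names are hypothetical; in practice each statement would run through a cursor as in the earlier sketches.

```python
# Minimal sketch: Time Travel and zero-copy cloning for load recovery.
# Table names are hypothetical; SQL would run via snowflake-connector-python.
statements = [
    # Query the table as it looked one hour ago (Time Travel).
    "SELECT COUNT(*) FROM orders AT(OFFSET => -3600)",

    # Snapshot the pre-load state as a zero-copy clone: instant, and no
    # extra storage is consumed until clone and source diverge.
    "CREATE TABLE orders_backup CLONE orders AT(OFFSET => -3600)",

    # Restore by swapping the clone in for the damaged table.
    "ALTER TABLE orders SWAP WITH orders_backup",
]
for sql in statements:
    print(sql)  # in practice: cursor.execute(sql)
```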
posted 1 month ago

Snowflake Administrator

Milestone Technologies, Inc.
experience 4 to 8 Yrs
location
Hyderabad, Telangana
skills
  • SQL
  • Data migration
  • Data security
  • Compliance
  • Access control
  • Snowflake administration
  • Backup
  • recovery
  • AWSAzureGCP integration
Job Description
As a Snowflake Administrator at Milestone Technologies, your role will involve various responsibilities to ensure efficient management and optimization of the Snowflake cloud data warehouse platform. Here is a summary of your key role overview, responsibilities, and required qualifications:

**Role Overview:**
You will be joining Milestone Technologies as a Snowflake Administrator, where you will play a crucial role in managing and optimizing the Snowflake cloud data warehouse platform to support the organization's IT development team.

**Key Responsibilities:**
- **Security & Access Management:** Implement RBAC, manage users/roles, control object-level access, and maintain network/security policies.
- **Resource Management:** Create and manage warehouses, databases, and schemas; optimize compute usage and enable secure data sharing.
- **Performance Optimization:** Monitor query and warehouse performance, resolve bottlenecks, and implement clustering, caching, and auto-suspend/resume.
- **Cost & Usage Governance:** Track compute/storage usage, manage warehouse sizes, and apply resource monitors for cost control (see the sketch below).
- **Data Governance & Compliance:** Enforce data security, masking, and access policies to meet compliance requirements (HIPAA, GDPR, SOC2).
- **Automation & Integration:** Automate Snowflake tasks and integrate with CI/CD, dbt, Terraform, and monitoring tools (e.g., Airflow, DataDog).
- **Collaboration & Documentation:** Maintain platform documentation and best practices, and support cross-functional data teams for operational excellence.

**Qualifications Required:**
- 4+ years of experience managing Snowflake or similar cloud data warehouse platforms.
- Strong expertise in Snowflake administration: roles, users, warehouses, and performance tuning.
- Solid SQL skills with experience in data migration, backup, and recovery.
- Exposure to data replication, failover, and large-scale production workloads.
- Familiarity with AWS/Azure/GCP integration (IAM, S3, networking).
- Hands-on experience in data security, compliance, and access control.

You will be a valuable addition to Milestone Technologies, contributing to the company's mission of revolutionizing the way IT is deployed. Your expertise and skills will play a significant role in driving business outcomes such as digital transformation, innovation, and operational agility for our clients.
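For illustration only, a minimal sketch of the access-management and cost-governance duties above: an RBAC role grant plus a resource monitor. Names and quotas are hypothetical; the SQL would run via snowflake-connector-python.

```python
# Minimal sketch: RBAC role setup and a resource monitor for cost control.
# All object names and quotas are hypothetical placeholders.
statements = [
    # RBAC: a read-only role for analysts, granted into the role hierarchy.
    "CREATE ROLE IF NOT EXISTS analyst_ro",
    "GRANT USAGE ON DATABASE edw TO ROLE analyst_ro",
    "GRANT USAGE ON SCHEMA edw.public TO ROLE analyst_ro",
    "GRANT SELECT ON ALL TABLES IN SCHEMA edw.public TO ROLE analyst_ro",
    "GRANT ROLE analyst_ro TO ROLE sysadmin",

    # Cost governance: cap monthly credits and act as thresholds are hit.
    """CREATE OR REPLACE RESOURCE MONITOR monthly_cap
         WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY
         START_TIMESTAMP = IMMEDIATELY
         TRIGGERS ON 75 PERCENT DO NOTIFY
                  ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE analytics_wh SET RESOURCE_MONITOR = monthly_cap",
]
for sql in statements:
    print(sql)  # in practice: cursor.execute(sql)
```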
posted 1 month ago
experience 5 to 9 Yrs
location
Hyderabad, Telangana
skills
  • SQL
  • Performance tuning
  • Cost optimization
  • Data security
  • RBAC
  • Compliance
  • Data modeling
  • Snowflake data warehouse platform
Job Description
As a Snowflake Subject Matter Expert (SME), your role involves architecting, developing, and optimizing data warehouse and lakehouse solutions using the Snowflake platform. Your expertise will play a crucial role in enabling the Data team to build a modern, scalable data platform. You will collaborate closely with data architects, engineers, and business teams to ensure data ingestion, storage, and access solutions on Snowflake meet performance, security, and compliance requirements.

**Key Responsibilities:**
- Design and implement Snowflake data warehouse and lakehouse architectures.
- Develop and optimize Snowflake SQL queries, stored procedures, and data transformation logic.
- Manage Snowflake account configurations, resource monitors, virtual warehouses, and access controls.
- Integrate Snowflake with other data platform components, including Databricks, Data Factory, and cloud storage.
- Implement best practices for data partitioning, clustering, and caching to optimize performance and cost-efficiency.
- Participate in data ingestion automation, metadata management, and monitoring within Snowflake environments.
- Collaborate with security teams to enforce data governance, encryption, and compliance policies in Snowflake (a masking-policy sketch follows below).
- Support CI/CD and automation of Snowflake deployment and pipeline orchestration.
- Provide knowledge transfer and technical guidance to team members and clients.

**Skills And Attributes for Success:**
- Deep expertise in the Snowflake data warehouse platform, including performance tuning and cost optimization.
- Strong SQL skills and experience with Snowflake-specific features such as Time Travel, Zero Copy Cloning, Streams, and Tasks.
- Understanding of data security, role-based access control (RBAC), and compliance in Snowflake.
- Knowledge of data modeling principles for enterprise data warehouses.
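For illustration only, a minimal sketch of the column-level security mentioned above: a masking policy that reveals email addresses only to a privileged role. Names are hypothetical; the SQL would run via snowflake-connector-python.

```python
# Minimal sketch: a Snowflake masking policy for column-level security.
# Policy, role, table, and column names are hypothetical placeholders.
statements = [
    """CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
       RETURNS STRING ->
         CASE
           WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
           ELSE '*** MASKED ***'
         END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
]
for sql in statements:
    print(sql)  # in practice: cursor.execute(sql)
```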
posted 2 days ago
experience 6 to 10 Yrs
location
Hyderabad, Telangana
skills
  • Java
  • Oracle ADF
  • MySQL
  • Jasper Reports
  • JBoss
  • Maven
  • Ant
  • XML
  • HTML
  • CSS
  • JavaScript
  • REST
  • SOAP
  • Git
  • SVN
  • MVC
  • SQL
  • Database Administration
  • Database Optimization
  • Database Backup
  • Soft Skills
  • ADF Business Components
  • ADF Faces
  • ADF Task Flows
  • Tomcat Server
  • JDeveloper IDE
  • JasperReports Server Administration
Job Description
As a Java & Oracle ADF Developer with 6-8 years of experience, your role involves designing, developing, and maintaining enterprise applications using Java, Oracle's ADF technology stack, and related technologies.

**Key Responsibilities:**
- Design and develop enterprise applications using the Java & Oracle ADF framework and related technologies
- Create and maintain ADF Business Components such as Entity Objects, View Objects, and Application Modules
- Develop user interfaces using ADF Faces components and ADF Task Flows
- Implement business logic and data validation rules using ADF BC
- Design and develop reports using Jasper Reports
- Configure and maintain application servers like Tomcat and JBoss
- Integrate applications with MySQL databases and web services
- Handle build and deployment processes
- Perform code reviews and ensure adherence to coding standards
- Debug and resolve production issues
- Collaborate with cross-functional teams including business analysts, QA, and other developers
- Provide technical documentation and maintain project documentation
- Participate in all phases of the software development lifecycle

**Qualifications Required:**
- Strong expertise in the Oracle ADF framework with 6-8 years of hands-on experience
- Proficiency in Java/J2EE technologies
- Advanced knowledge of ADF Business Components including Entity Objects, View Objects, and Application Modules
- Strong experience with ADF Faces Rich Client components
- Expertise in ADF Task Flows (bounded and unbounded)
- Proficiency in MySQL database design, optimization, and query writing
- Strong experience with Jasper Reports for report development and customization
- Experience in deploying and maintaining applications on Tomcat Server
- Experience with JBoss/WildFly application server configuration and deployment
- Expertise in build tools like Maven/Ant and build automation
- Experience with continuous integration and deployment processes
- Knowledge of application server clustering and load balancing

The job also requires:
- Knowledge of XML, HTML, CSS, and JavaScript
- Experience with the JDeveloper IDE
- Understanding of REST/SOAP web services and version control systems like Git/SVN
- Understanding of MVC architecture patterns, performance tuning, and optimization
- MySQL database administration, including writing complex SQL queries and stored procedures, database optimization and performance tuning, and database backup and recovery procedures
- Jasper Reports design and development, including creating complex reports with sub-reports, knowledge of JasperReports Server administration, and the ability to integrate reports with web applications
- Strong analytical and problem-solving abilities, excellent communication and team collaboration skills, the ability to work independently and manage multiple priorities, good documentation skills, and attention to detail with a quality-oriented mindset
posted 1 day ago
experience 5 to 9 Yrs
location
Hyderabad, Telangana
skills
  • MongoDB
  • Oracle
  • SQL Server
  • MySQL
  • PostgreSQL
  • Redis
  • PLSQL
  • TSQL
  • Aerospike 80
Job Description
As a highly skilled and hands-on Database Lead/Architect, you will be responsible for designing and implementing database architectures for high-concurrency, mission-critical applications. Your expertise across relational, NoSQL, and in-memory databases, particularly Aerospike 8.0 and/or MongoDB combined with traditional RDBMS knowledge, is essential for this role. You must be capable of designing enterprise-scale systems from scratch, implementing advanced business logic, and ensuring performance, scalability, and high availability.

**Key Responsibilities:**
- **Architecture & System Design**
  - Design end-to-end database architectures for high-concurrency applications.
  - Architect hybrid solutions combining RDBMS, NoSQL (Aerospike 8.0, MongoDB), and Redis.
  - Implement partitioning, sharding, replication, clustering, and geo-distributed strategies.
  - Leverage in-memory indexes and advanced Aerospike 8.0 features.
  - Apply MongoDB advanced concepts.
- **Database Development & Business Logic**
  - Develop complex business logic using SQL and NoSQL frameworks.
  - Write and optimize UDFs, stored procedures, and advanced query pipelines.
  - Standardize best practices for schema design, indexing, and query optimization.
- **Performance & Scalability**
  - Conduct deep performance tuning at query, schema, and cluster levels.
  - Implement in-memory caching strategies (see the cache-aside sketch below).
  - Optimize database access patterns for low-latency, high-throughput workloads.
  - Monitor, analyze, and resolve performance bottlenecks.
- **Leadership & Collaboration**
  - Act as the technical authority for database solutions.
  - Lead design reviews, mentor developers, and enforce coding standards.
  - Collaborate with application architects, data engineers, and DevOps teams.
  - Evaluate emerging database technologies and recommend adoption.

**Qualifications Required:**
- RDBMS expertise: Oracle PL/SQL, MS SQL Server (T-SQL), MySQL, PostgreSQL.
- NoSQL expertise: Aerospike 8.0, MongoDB, Redis.
- Strong proficiency in partitioning, sharding, clustering, replication, and HA/DR strategies.
- Proven track record of designing database systems for large-scale, real-time applications.
- Deep knowledge of performance tuning, query optimization, and indexing strategies.

The company prefers candidates with exposure to cloud-native database services, streaming platforms, big data ecosystems, and microservices database integration patterns. An advanced degree in Computer Science, Information Systems, or a related field is also a plus.
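For illustration only, a minimal sketch of the in-memory caching strategy listed above: a cache-aside read path with Redis in front of a relational store. It assumes the redis-py package and a locally running Redis; the database lookup is a hypothetical stub.

```python
# Minimal sketch: cache-aside pattern with Redis in front of an RDBMS.
import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

def fetch_user_from_db(user_id: int) -> dict:
    # Stand-in for an indexed primary-key lookup in the relational store.
    return {"id": user_id, "name": "example"}

def get_user(user_id: int) -> dict:
    key = f"user:{user_id}"
    cached = r.get(key)
    if cached is not None:                 # cache hit: skip the database
        return json.loads(cached)
    user = fetch_user_from_db(user_id)     # cache miss: read through
    r.setex(key, 300, json.dumps(user))    # populate with a 5-minute TTL
    return user

print(get_user(42))
```

The TTL bounds staleness; for write-heavy keys, explicit invalidation on update is the usual complement to this pattern.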
posted 1 day ago

Snowflake Architect

RENOVISION AUTOMATION SERVICES PVT.LTD
experience 12 to 16 Yrs
location
Hyderabad, Telangana
skills
  • Snowflake
  • Data Warehousing
  • Data Engineering
  • SQL
  • Python
  • Data Modeling
  • DBT
  • Airflow
  • SSIS
  • Cloud Architecture
  • Snowflake Implementations
  • DataOps
  • MLOps
  • CICD Pipelines
  • IICS
Job Description
As an experienced Snowflake Architect with over 12 years of expertise in data warehousing, cloud architecture, and Snowflake implementations, your role revolves around designing, optimizing, and managing large-scale Snowflake data platforms to ensure scalability, performance, and security. You are expected to possess deep technical knowledge of Snowflake, cloud ecosystems, and data engineering best practices.

**Key Responsibilities:**
- Lead the design and implementation of Snowflake data warehouses, data lakes, and data marts.
- Define best practices for Snowflake schema design, clustering, partitioning, and optimization.
- Architect multi-cloud Snowflake deployments with seamless integration, and design data sharing, replication, and failover strategies for high availability.
- Optimize query performance using Snowflake features, implement automated scaling strategies for dynamic workloads, and troubleshoot performance bottlenecks in large-scale Snowflake environments.
- Architect ETL/ELT pipelines using Snowflake, Coalesce, and other tools; integrate Snowflake with BI tools, ML platforms, and APIs; and implement CDC, streaming, and batch processing solutions.

**Qualifications Required:**
- Over 12 years of experience in data warehousing, cloud architecture, and database technologies.
- 8+ years of hands-on Snowflake architecture and administration experience.
- Expertise in SQL and Python for data processing.
- Deep knowledge of Snowflake features, experience with cloud platforms, and a strong understanding of data modeling.
- Certification as a Snowflake Advanced Architect is a must.

In terms of security, governance, and compliance, your responsibilities include defining RBAC, data masking, row-level security, and encryption policies in Snowflake (a row-access-policy sketch follows below). You will ensure compliance with GDPR, CCPA, HIPAA, and SOC2 regulations and establish data lineage, cataloging, and auditing using Snowflake's governance features.

As a leader in this role, you will mentor data engineers, analysts, and developers on Snowflake best practices, collaborate with C-level executives to align Snowflake strategy with business goals, and evaluate emerging trends for innovation. Preferred skills for this position include knowledge of DataOps, MLOps, and CI/CD pipelines, as well as familiarity with DBT, Airflow, SSIS, and IICS.
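For illustration only, a minimal sketch of the row-level security responsibility above: a Snowflake row access policy that limits each role to its own region. The policy logic, role names, and table are hypothetical; the SQL would run via snowflake-connector-python.

```python
# Minimal sketch: a Snowflake row access policy for row-level security.
# The role-to-region mapping shown here is a hypothetical convention.
statements = [
    """CREATE OR REPLACE ROW ACCESS POLICY region_policy AS (region STRING)
       RETURNS BOOLEAN ->
         CURRENT_ROLE() = 'GLOBAL_ANALYST'
         OR region = CURRENT_ROLE()  -- e.g. role EMEA sees region 'EMEA'
    """,
    "ALTER TABLE sales ADD ROW ACCESS POLICY region_policy ON (region)",
]
for sql in statements:
    print(sql)  # in practice: cursor.execute(sql)
```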
posted 1 day ago
experience 2 to 6 Yrs
location
Hyderabad, Telangana
skills
  • SQL
  • Stakeholder management
  • Excel skills
  • Data visualization tools
  • SAP proficiency
Job Description
As a Sourcing Analyst at Solenis in Hyderabad, India, you will play a crucial role in enhancing data analytics quality and establishing reporting mechanisms for spend and savings data analysis. Your responsibilities will include:
- Developing dashboards, reports, and visualizations to convey insights to procurement teams.
- Creating and maintaining KPI tracking for procurement performance evaluation.
- Implementing automation and advanced technologies like AI for improved reporting.
- Utilizing advanced analytics techniques such as predictive modeling, clustering, and optimization to identify value opportunities (a clustering sketch follows below).

Qualifications required for this role:
- Proficiency in written and spoken English.
- Strong Excel skills.
- Experience with data visualization tools (e.g., Power BI, Tableau, Qlik).
- Experience working with large datasets.
- SQL or similar tools for data analysis.
- SAP proficiency, with direct-material experience preferred.
- Excellent communication and stakeholder management skills.

Solenis, a leading global provider of water and hygiene solutions, values diversity and inclusivity, recognizing its people as its greatest asset. The company offers a vibrant work environment with world-class infrastructure and ample career development opportunities. If you are passionate about contributing to meaningful work in a stable and rapidly growing organization, Solenis welcomes your application.
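For illustration only, a minimal sketch of the clustering technique named above applied to procurement data: grouping suppliers by spend profile with k-means. The features are synthetic placeholders, not Solenis data.

```python
# Minimal sketch: segmenting suppliers by spend profile with k-means.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# Synthetic columns: annual spend (USD), order count, average days-to-pay.
suppliers = np.column_stack([
    rng.lognormal(11, 1.0, 300),
    rng.poisson(40, 300),
    rng.normal(45, 10, 300),
])

X = StandardScaler().fit_transform(suppliers)   # scale features before k-means
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

for label in range(4):
    seg = suppliers[kmeans.labels_ == label]
    print(f"segment {label}: {len(seg)} suppliers, "
          f"median spend ${np.median(seg[:, 0]):,.0f}")
```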
posted 2 days ago

BD&I Data Scientist

Hitachi Careers
experience 5 to 9 Yrs
location
Hyderabad, Telangana
skills
  • Machine Learning
  • Statistical Modeling
  • Python
  • SQL
  • Unsupervised Learning
  • Deep Learning
  • Natural Language Processing
  • Statistical Concepts
  • AWS
  • Azure
  • GCP
  • Businessfocused Analytics
  • Azure ML
  • Databricks
  • Supervised Learning
  • Data Science Libraries
  • Scikitlearn
  • TensorFlow
  • PyTorch
  • Data Analysis Methodologies
  • Big Data Technologies
  • Cloud Computing Platforms
Job Description
Role Overview:
You are a skilled Data Scientist with expertise in machine learning, statistical modeling, and business-focused analytics. Your role involves building predictive models, time-series forecasting, and optimization solutions using Python, SQL, and Azure ML or Databricks. Strong communication skills are essential to translate complex data into clear business insights.

Key Responsibilities:
- Analyze large, complex datasets to uncover actionable insights for driving business impact.
- Design and implement predictive models, especially in areas like customer penetration optimization and regional performance forecasting.
- Apply machine learning techniques (supervised and unsupervised learning) to solve critical business problems such as classification, regression, and clustering (a classifier sketch follows below).
- Engineer meaningful features from raw data to optimize model performance.
- Conduct exploratory data analysis (EDA) to identify trends, anomalies, and business opportunities.
- Conduct feature selection and hyperparameter tuning to enhance model performance.
- Validate model results using appropriate statistical and ML evaluation metrics.
- Collaborate with data engineers to build and refine robust data pipelines.
- Translate insights into clear, actionable strategies by working closely with product, engineering, and business teams.
- Utilize visualization tools like Tableau, Power BI, and Google Analytics to communicate insights and model outcomes.
- Build models for production deployment using tools like Spring Boot and GitLab.
- Consider partner-specific (JV & Wholesale) and regional variations to fine-tune analytics approaches.
- Stay updated with ML/AI innovations and suggest tool or process improvements.
- Participate in Agile ceremonies and manage tasks using tools such as Jira.
- Document methodologies and maintain version control for all modeling projects.

Qualification Required:
- Bachelor's degree in Computer Science or a related field.
- Master's degree or equivalent advanced degree preferred.
- Proven track record of delivering data science projects from ideation to production.
- Strong communication skills and the ability to tell compelling stories with data.
- Comfortable working with both structured and unstructured data sets.

Additional Company Details:
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner to the world's largest and most forward-thinking companies. Since 2000, GlobalLogic has been pioneering the digital revolution by creating innovative and widely used digital products and experiences. The company collaborates with clients to transform businesses and redefine industries through intelligent products, platforms, and services. GlobalLogic prioritizes a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and integrity as key values for employees and clients.
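For illustration only, a minimal sketch of the supervised-learning workflow described above: train a classifier on a held-out split and report a threshold-independent metric. A bundled scikit-learn dataset stands in for business data.

```python
# Minimal sketch: train/evaluate a classifier with a stratified holdout split.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Evaluate with ROC-AUC, which does not depend on a decision threshold.
probs = model.predict_proba(X_test)[:, 1]
print(f"test ROC-AUC: {roc_auc_score(y_test, probs):.3f}")
```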
posted 2 weeks ago
experience 3 to 7 Yrs
location
Hyderabad, All India
skills
  • Python
  • Predictive Modelling
  • Azure
  • AWS
  • SDLC
  • Agile methodologies
  • PySpark
  • MLflow
  • DataBricks
  • Delta Lake
  • Unity Catalog
  • Databricks Runtime for ML
  • Spark SQL
  • Feature Store
Job Description
Role Overview:
You will be responsible for designing and deploying predictive models, building ML pipelines, optimizing model performance, implementing Delta Lake, enabling CI/CD for ML pipelines, and troubleshooting issues in Spark jobs and the Databricks environment. As a Senior Programmer Analyst in AI-ML Engineering, you are expected to have strong technical expertise and experience in Python, PySpark, MLflow, and predictive modeling techniques.

Key Responsibilities:
- Design and deploy predictive models (e.g., forecasting, churn analysis, fraud detection) using Python/SQL, Spark MLlib, and Databricks ML.
- Build end-to-end ML pipelines (data ingestion, feature engineering, model training, deployment) on the Databricks Lakehouse.
- Optimize model performance via hyperparameter tuning, AutoML, and MLflow tracking (a tracking sketch follows below).
- Implement Delta Lake for scalable, ACID-compliant data workflows.
- Enable CI/CD for ML pipelines using Databricks Repos and GitHub Actions.
- Troubleshoot issues in Spark jobs and the Databricks environment.

Qualifications Required:
- 3 to 5 years of experience in AI-ML Engineering.
- Strong expertise in Python, PySpark, and MLflow.
- Experience with predictive modeling techniques including classification, clustering, regression, time series, and NLP.
- Knowledge of Databricks is a plus.
- Familiarity with cloud platforms like Azure/AWS, Delta Lake, and Unity Catalog.
- Hands-on experience with Databricks Runtime for ML, Spark SQL, and PySpark.
- Understanding of MLflow, Feature Store, and Unity Catalog for governance.
- Knowledge of SDLC and Agile methodologies.
- Excellent oral and written communication skills.
- Ability to work effectively in a team; proactive and adaptive.
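For illustration only, a minimal sketch of the MLflow tracking mentioned above: logging parameters, a metric, and the fitted model for one training run. It assumes the mlflow package; the experiment name and data are placeholders.

```python
# Minimal sketch: one MLflow-tracked training run.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mlflow.set_experiment("churn-model")          # hypothetical experiment name
with mlflow.start_run():
    params = {"C": 0.5, "max_iter": 500}
    model = LogisticRegression(**params).fit(X_tr, y_tr)

    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy_score(y_te, model.predict(X_te)))
    mlflow.sklearn.log_model(model, "model")  # stores the fitted estimator
```

On Databricks, the same calls log to the workspace tracking server automatically; hyperparameter sweeps would wrap this in one run per candidate configuration.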
posted 1 month ago

Snowflake Lead / Architect

Blumetra Solutions
experience 4 to 12 Yrs
location
Hyderabad, Telangana
skills
  • snowflake
  • airflow
  • dbt
  • snowsql
Job Description
As a Snowflake Lead/Architect, you will be responsible for designing a high-performance Snowflake architecture to serve as the foundation for the enterprise analytics strategy. Your key responsibilities will include:
- Architecting scalable Snowflake models including facts, conformed dimensions, and semantic views
- Implementing Row-Level Security (RLS), masking, and Single Sign-On (SSO)-based access governance
- Optimizing performance through clustering, query insights, and autoscaling warehouses (see the sketch below)
- Integrating metadata and lineage using tools like Atlan for enterprise visibility
- Collaborating with BI, Data Engineering, and Governance teams to ensure metric parity across the stack

To qualify for this role, you should have:
- 8+ years of experience in data engineering/warehousing, with at least 4 years working with Snowflake
- Deep expertise in modeling, query optimization, and governance
- Strong proficiency in SQL, SnowSQL, and pipeline orchestration tools such as dbt, Airflow, and ADF
- Familiarity with Atlan, Collibra, or Power BI integration

If you are passionate about data architecture, performance tuning, and governed analytics at scale, we would love to connect with you. This role offers a 6+ month contract-to-hire opportunity for individuals with 8-12+ years of experience. Your skills in Snowflake, Airflow, SnowSQL, and dbt will be key in this role.
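For illustration only, a minimal sketch of the autoscaling-warehouse responsibility above: a multi-cluster warehouse that scales out under concurrency and suspends when idle. The name and sizes are hypothetical; the SQL would run via snowflake-connector-python.

```python
# Minimal sketch: a multi-cluster, auto-suspending Snowflake warehouse.
# Warehouse name, size, and cluster counts are hypothetical placeholders.
statements = [
    """CREATE OR REPLACE WAREHOUSE bi_wh
         WAREHOUSE_SIZE = 'MEDIUM'
         MIN_CLUSTER_COUNT = 1
         MAX_CLUSTER_COUNT = 4          -- scale out for concurrent BI users
         SCALING_POLICY = 'STANDARD'
         AUTO_SUSPEND = 120             -- seconds idle before suspending
         AUTO_RESUME = TRUE""",
]
for sql in statements:
    print(sql)  # in practice: cursor.execute(sql)
```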