
82 Snowflake Jobs in Gurgaon

posted 2 months ago

Lead - Quality Tester

Ameriprise Financial Services, LLC
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Test analysis
  • Test design
  • Data acquisition
  • Data transformation
  • Tableau
  • SQL
  • AWS
  • Snowflake
  • SQL server
  • Defect reporting and analysis
  • Interacting with business and technology teams
  • Investment management data testing
  • Data quality checks
  • BI visualization tools
  • Databricks
  • Aladdin
  • Aladdin Data Cloud (ADC)
Job Description
In this role at Ameriprise, you will be responsible for testing activities within Threadneedle Asset Management's Global Transformation and Change department, focusing primarily on UAT. Your role will involve collaborating with test management, project managers, business analysts, and business units to ensure the effectiveness of testing efforts. You will be expected to understand key program/project testing requirements, follow test plans using provided frameworks and templates, and contribute positively in project discussions by analyzing project documentation.

**Key Responsibilities:**
- Support the project schedule and resource model related to testing efforts
- Review requirements documentation and provide feedback for clarity and testability
- Develop test cases in collaboration with business and project teams
- Lead and execute test cases to validate documented requirements
- Maintain records of test execution results using JIRA for project and audit purposes
- Document and track defects through to completion using JIRA
- Contribute to reporting on test progress, defect statistics, and resolution
- Identify testing issues, take corrective actions, and escalate when necessary
- Contribute to end-of-phase reports and project phase-end review sessions

**Key Capabilities - Essentials:**
- Bachelor's/Master's degree or equivalent
- 5-7 years of relevant testing experience
- Strong understanding of investment management data and data testing
- Experience with test management/defect tracking tools like JIRA, Quality Centre, ADO, etc.
- Strong analytical skills with experience in producing and executing test scenarios and cases
- Ability to work collaboratively, show a positive attitude, and actively solve issues
- Self-starter with strong time management skills, able to work under pressure
- Skilled in understanding business requirements, building relationships, and interpreting data
- Ability to work independently, manage priorities, and meet deadlines
- Strong communication, presentation, and stakeholder management skills

**Additional Knowledge And Skills:**
- Experience in data acquisition, transformation, quality checks, presentation, Tableau, and BI tools
- Experience in investment risk data and risk systems
- Strong SQL skills and experience with AWS, Snowflake, Databricks, SQL Server
- Familiarity with Aladdin/Aladdin Data Cloud (ADC)

In addition to the above responsibilities and qualifications, Ameriprise India LLP has been providing client-based financial solutions globally for 125 years, with a focus on Asset Management, Retirement Planning, and Insurance Protection. Join a collaborative and inclusive culture that rewards contributions and offers opportunities for career growth. Work with talented individuals who share your passion for doing great work and making a difference in the community. This is a full-time position with timings from 2:00 pm to 10:30 pm in the AWMP&S President's Office under the Business Support & Operations job family group.

posted 2 months ago
Experience: 3 to 8 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Architecting
  • GCP
  • AWS
  • Azure
  • Python
  • Snowflake
  • Kafka
  • DBT
  • Airflow
  • Kubernetes
  • Data Enterprise Architect
  • Designing Data pipelines
  • Data Integrations
  • Big Data Solutions
  • GCP pubsub
  • Cloud Dataflow
Job Description
As an experienced Data Enterprise Architect at ShyftLabs, you will be a key member of our dynamic team that collaborates with various IT groups to define architectures and lead projects across functional agency systems and other state entities. You will have the opportunity to work under minimal supervision, utilizing your initiative and independent judgment extensively.

**Key Responsibilities:**
- Plan and analyze user requirements, procedures, and problems to automate processing or improve existing systems
- Prepare As-Is, To-Be, Gap analysis, SWOTs, charts, diagrams, and tables to depict present and proposed systems
- Drive and lead initiatives across functional agency systems, IT groups, and state entities during project development and implementation stages
- Provide expertise in determining and implementing architectural solutions
- Evaluate current and future solutions, applications, and technologies, and establish planning objectives
- Develop and oversee design procedures, program codes, test procedures, and quality standards
- Translate business strategy into technical strategy and define end-to-end technology architectures
- Articulate the desired future state, understand the current state, identify gaps, and develop approaches to close these gaps
- Own and drive forward MDM, Data Governance, Big Data, and Cloud Data Management
- Develop data integration processes using ETL platforms like BigQuery, Informatica, Snowflake
- Collaborate with the Enterprise Architecture team on standards, product roadmap, and architecture decisions
- Monitor CDI services and components to maintain high performance, throughput, and availability
- Collaborate with global team members to deliver MDM/Governance team deliverables and drive continuous improvement
- Collaborate closely with the Data Engineering and Product teams to execute the set roadmap of data ingestion, integration, and transformation

**Qualifications Required:**
- Minimum 8 years of experience in architecting and designing data pipelines and integrations
- Minimum 3 years of experience in architecting and managing large data sets and CDI platforms
- Minimum 3 years of experience in developing Big Data solutions and public cloud implementations (GCP, AWS, Azure)
- Minimum 3 years of experience using GCP Pub/Sub for building data integration pipelines
- Minimum 6 years of experience in developing with Python
- Minimum 3 years of experience with Cloud Dataflow and other GCP tools for data engineering
- Expertise in data models, data pipeline concepts, and cloud-based infrastructure disciplines
- Deep working knowledge of data technologies such as Snowflake, Kafka, DBT, Airflow, Kubernetes, etc.
- In-depth experience in data ingestion/pipelines, migration, testing, validation, cleansing, and modeling

At ShyftLabs, we offer a competitive salary and a strong insurance package. We are committed to the growth and development of our employees, providing extensive learning and development resources.
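For illustration only, a minimal sketch of the kind of GCP Pub/Sub ingestion step mentioned in the requirements above; the project and topic names are hypothetical and not taken from the listing.

```python
# Minimal sketch: publish change events to a GCP Pub/Sub topic so a downstream
# pipeline (e.g., Dataflow or a warehouse loader) can consume them.
# Project/topic names are placeholders.
from google.cloud import pubsub_v1
import json

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")  # hypothetical names

def publish_event(record: dict) -> None:
    """Serialize a record and publish it; Pub/Sub payloads must be bytes."""
    future = publisher.publish(topic_path, data=json.dumps(record).encode("utf-8"))
    future.result()  # block until the broker acknowledges the message

publish_event({"order_id": 123, "status": "SHIPPED"})
```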
posted 2 months ago
Experience: 2 to 6 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Data Warehousing
  • Data Integration
  • Data Governance
  • ETL tools
  • AWS
  • Azure
  • GCP
  • Snowflake
  • Data Lakes
  • Presales experience
  • Cloud platforms
Job Description
As a data architecture leader, you will play a crucial role in driving data warehousing and data center architecture initiatives. Your expertise in Data Warehousing, Data Lakes, Data Integration, and Data Governance, combined with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake, will be highly valuable. Your responsibilities will encompass technical leadership, presales support, and managing complex enterprise deals across different regions.

**Key Responsibilities:**
- Architect and design scalable Data Warehousing and Data Lake solutions
- Lead presales engagements and manage the RFP/RFI/RFQ lifecycle
- Develop and present compelling proposals and solution designs to clients
- Collaborate with cross-functional teams to deliver end-to-end solutions
- Estimate efforts and resources for customer requirements
- Drive Managed Services opportunities and enterprise deal closures
- Engage with clients in MEA, APAC, US, and UK regions
- Ensure solutions align with business goals and technical requirements
- Maintain high standards of documentation and presentation for client-facing materials

**Qualification Required:**
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Certifications in AWS, Azure, GCP, or Snowflake are advantageous
- Experience in consulting or system integrator environments
- Proficiency in Data Warehousing, Data Lakes, Data Integration, and Data Governance
- Hands-on experience with ETL tools (e.g., Informatica, Talend, etc.)
- Exposure to cloud environments: AWS, Azure, GCP, Snowflake
- Minimum 2 years of presales experience with an understanding of presales operating processes
- Experience in enterprise-level deals and Managed Services
- Ability to handle multi-geo engagements effectively
- Excellent presentation and communication skills
- Strong understanding of effort estimation techniques for customer requirements

posted 2 months ago

Database Architect

Unlink Technologies Private limited
Experience: 6 to 10 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • PostgreSQL
  • Kafka
  • Snowflake
  • Airflow
  • dbt
  • Python
  • AWS
  • GCP
  • Azure
  • Debezium
  • ClickHouse
  • BigQuery
  • Dagster
Job Description
As a Data Platform / Database Architect specializing in Postgres and Kafka, you will play a crucial role in designing and optimizing a high-throughput, audit-friendly data platform supporting a SaaS for financial data automation and reconciliation. Your responsibilities will encompass owning the end-to-end design and performance of the data platform, including multitenant Postgres schemas, CDC pipelines, and analytics stores. Additionally, you will be instrumental in laying the groundwork for AI-powered product features.

Key Responsibilities:
- Design multitenant Postgres schemas with a focus on partitioning, indexing, normalization, and RLS, while defining retention and archival strategies.
- Optimize Postgres performance by implementing strategies such as EXPLAIN/ANALYZE, connection pooling, vacuum/bloat control, query/index tuning, and replication.
- Develop event streaming/CDC using Kafka/Debezium, including topics, partitions, schema registry setup, and data delivery to analytics engines like ClickHouse, Snowflake, and BigQuery.
- Model analytics layers using star/snowflake schemas, orchestrate jobs with tools like Airflow/Dagster, and implement dbt-based transformations.
- Establish observability and SLOs for data by monitoring query/queue metrics, implementing tracing, setting up alerting systems, and conducting capacity planning.
- Implement data security measures including encryption, masking, tokenization of PII, and defining IAM boundaries to enhance the audit posture, such as PCI compliance.
- Integrate AI components like vector embeddings (pgvector/Milvus), basic feature store patterns (Feast), retrieval pipelines, and metadata lineage.
- Collaborate with backend, ML, and product teams to review designs, coach engineers, create documentation/runbooks, and lead migrations.

Qualifications Required:
- Minimum of 6 years of experience in building high-scale data platforms, with in-depth expertise in PostgreSQL covering areas like partitioning, advanced indexing, query planning, and replication/HA.
- Hands-on experience with Kafka (or equivalent) and Debezium/CDC patterns, familiarity with schema registry setups (Avro/Protobuf), and understanding of exactly-once/at-least-once delivery semantics.
- Proficiency in at least one analytics engine at scale, such as ClickHouse, Snowflake, or BigQuery, along with strong SQL skills.
- Working knowledge of Python for data tooling (e.g., pydantic, SQLAlchemy), experience in orchestration with Airflow or Dagster, and expertise in implementing transformations using dbt.
- Solid experience in cloud environments like AWS/GCP/Azure, including networking, security groups/IAM, secrets management, and cost controls.
- A pragmatic approach to performance engineering, excellent communication skills, and a knack for thorough documentation.
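As an illustration of the multitenant, partitioned, RLS-protected Postgres layout this listing describes, here is a minimal sketch executed via psycopg2. The table, column, and policy names are hypothetical, not taken from the posting.

```python
# Illustrative only: a range-partitioned multitenant table with row-level
# security, created through psycopg2. Names and the DSN are placeholders.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS ledger_entries (
    tenant_id   BIGINT         NOT NULL,
    entry_id    BIGINT         NOT NULL,
    entry_date  DATE           NOT NULL,
    amount      NUMERIC(18, 2) NOT NULL,
    PRIMARY KEY (tenant_id, entry_id, entry_date)
) PARTITION BY RANGE (entry_date);

CREATE TABLE IF NOT EXISTS ledger_entries_2024
    PARTITION OF ledger_entries
    FOR VALUES FROM ('2024-01-01') TO ('2025-01-01');

CREATE INDEX IF NOT EXISTS idx_ledger_tenant_date
    ON ledger_entries (tenant_id, entry_date);

ALTER TABLE ledger_entries ENABLE ROW LEVEL SECURITY;
CREATE POLICY tenant_isolation ON ledger_entries
    USING (tenant_id = current_setting('app.tenant_id')::BIGINT);
"""

with psycopg2.connect("dbname=platform user=admin") as conn:  # placeholder DSN
    with conn.cursor() as cur:
        cur.execute(DDL)
```

The partition key (entry_date) is included in the primary key because Postgres requires it, and the per-tenant policy reads the tenant id from a session setting, which is one common way to enforce tenant isolation.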
posted 2 months ago

Full Stack Tech Lead

Hitachi Careers
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Node
  • JavaScript
  • Python
  • Angular
  • git
  • GitHub
  • Jenkins
  • DynamoDB
  • AWS
  • Athena
  • Snowflake
  • TypeScript
  • React
  • Terraform
  • Cloud Formation
  • EC2
  • SNS
  • Step functions
  • Redshift
  • S3
Job Description
Role Overview: You will be working on a huge software project for a world-class company providing M2M / IoT 4G/5G modules to industries like automotive, healthcare, and logistics. Your role will involve contributing to the development of end-user modules' firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Key Responsibilities:
- Participate in all aspects of agile software development, including design, implementation, and deployment
- Architect and provide guidance on building end-to-end systems optimized for speed and scale
- Work with technologies such as Node, JavaScript, TypeScript, Python, React, Angular, git, GitHub, Jenkins, Terraform, CloudFormation, DynamoDB, and AWS (EC2, SNS, Step Functions, DynamoDB, Redshift, Athena, Snowflake, S3)

Qualifications Required:
- Experience as a full stack developer with a readiness to work with new technologies and architectures
- Proficiency in web frameworks, APIs, databases, and multiple back-end languages

At GlobalLogic, you will experience a culture of caring that prioritizes putting people first. You will have opportunities for continuous learning and development to grow personally and professionally. You will work on interesting and meaningful projects, collaborating with global clients to bring impactful solutions to market. The organization values balance and flexibility, providing various work arrangements to help you achieve a harmonious work-life balance.
posted 2 months ago
Experience: 4 to 10 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • C#
  • WinForms
  • HTML
  • CSS
  • JavaScript
  • RESTful APIs
  • SDLC
  • Agile methodologies
  • Microservices
  • SQL Server
  • Snowflake
  • Git
  • Azure DevOps
  • NET Framework
  • ASP.NET
  • Azure Cloud Platform
  • Azure Services
  • Azure Functions
  • React
  • Docker/Kubernetes
  • CI/CD pipelines
Job Description
As a .NET Developer with 4-10 years of experience, you will play a crucial role in developing cutting-edge solutions and driving innovation in the digital space. Your responsibilities will include:
- Having a solid background in .NET Framework, C#, and ASP.NET
- Developing desktop applications using WinForms
- Hands-on experience with the Azure Cloud Platform, including Azure Services and Azure Functions
- Proficiency in front-end skills such as HTML, CSS, JavaScript, and React
- Strong understanding of RESTful APIs, SDLC, and Agile methodologies
- Demonstrating excellent problem-solving and collaboration abilities

Preferred extras for this role include familiarity with Microservices and Docker/Kubernetes, experience with SQL Server and Snowflake, and knowledge of CI/CD pipelines, Git, and Azure DevOps. Exposure to investment management technologies would be a plus.

If you are passionate about technology and want to contribute to innovative projects, we invite you to join our team. Feel free to drop your resume or reach out directly via DM if you are interested or know someone who might be a great fit.
posted 2 months ago
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Qlik Sense
  • Tableau
  • SAP Business Objects
  • Snowflake
  • Informatica
  • Alteryx
  • Kubernetes
  • Helm
  • Docker
  • Ignition
  • Power BI
  • Matillion
  • HVR
  • Databricks
  • Linux command line
  • Cloud ops
  • ABAC
  • VDS
  • PING Federate
  • Azure Active Directory
  • Argo CD
  • Cloud networking
Job Description
Role Overview: As an Ignition Application Administrator at EY, you will be part of the Enterprise Services Data team, collaborating closely with platform administrators, developers, Product/Project Seniors, and Customers. Your primary responsibility will be administering the existing analytics platforms, with a specialization in Ignition while also having knowledge of other platforms like Qlik Sense, Tableau, Power BI, SAP Business Objects, and more. Your role will involve diving into complex problems to find elegant solutions and communicating effectively with team members across various disciplines.

Key Responsibilities:
- Install and configure the Ignition platform.
- Monitor the Ignition platform, integrate with observability and alerting solutions, and suggest platform enhancements.
- Troubleshoot and resolve Ignition platform issues.
- Manage data source connections and asset libraries.
- Identify and address system capacity issues.
- Define best practices for Ignition deployment.
- Integrate Ignition with other ES Data platforms and Business Unit installations.
- Contribute to data platform architecture and strategy.
- Research and propose alternative actions for problem resolution following best practices and application functionality.

Qualifications Required:
- Minimum of 3 years of experience in customer success or customer-facing engineering roles.
- Experience with large-scale implementations in a complex solutions environment.
- Proficiency with the Linux command line.
- Ability to analyze technical concepts and translate them into business terms.
- Familiarity with software development processes and methodologies.
- Experience in cloud ops/Kubernetes application deployment and management.
- Knowledge of Attribute-based Access Control, Virtual Directory Services, and cloud platform architecture.
- Excellent communication skills: interpersonal, written, and verbal.
- BA/BS degree in technology, computing, or a related field, or equivalent work experience of 3+ years.

Company Details: EY aims to build a better working world by creating long-term value for clients, people, and society while fostering trust in the capital markets. Through diverse teams across 150 countries, EY provides assurance and supports clients in growth, transformation, and operations in various sectors like assurance, consulting, law, strategy, tax, and transactions. EY teams focus on asking better questions to find innovative solutions to today's complex global challenges.
posted 2 months ago
Experience: 6 to 10 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Python
  • Celery
  • Docker
  • Kubernetes
  • GitHub
  • JIRA
  • MongoDB
  • PostgreSQL
  • Snowflake
  • Tableau
  • Kafka
  • GCS
  • FastAPI
  • PyTest
  • LangChain
  • LangGraph
  • TensorFlow
  • PyTorch
  • MSSQL
  • BigQuery
  • Terraform
  • GitHub Actions
  • VertexAI
  • BigQuery
  • GKE
  • DataFlow
Job Description
Role Overview: As a Principal Software Engineer, you will play a crucial role in designing, developing, and deploying advanced AI and generative AI-based products. Your responsibilities will include driving technical innovation, leading complex projects, and collaborating closely with cross-functional teams to deliver high-quality, scalable, and maintainable solutions. This role necessitates a strong background in software development, AI/ML techniques, and DevOps practices. Additionally, you will be expected to mentor junior engineers and contribute to strategic technical decisions.

Key Responsibilities:
- Advanced Software Development: Design, develop, and optimize high-quality code for complex software applications and systems, ensuring performance, scalability, and maintainability. Drive best practices in code quality, documentation, and test coverage.
- GenAI Product Development: Lead end-to-end development of generative AI solutions, from data collection and model training to deployment and optimization. Experiment with cutting-edge generative AI techniques to enhance product capabilities and performance.
- Technical Leadership: Own architecture and technical decisions for AI/ML projects. Mentor junior engineers, review code for adherence to best practices, and uphold a high standard of technical excellence within the team.
- Project Ownership: Lead execution and delivery of features; manage project scope, timelines, and priorities in collaboration with product managers. Proactively identify and mitigate risks to ensure successful, on-time project completion.
- Architectural Design: Contribute to the architectural design and planning of new features, ensuring scalable, reliable, and maintainable solutions. Engage in technical reviews with peers and stakeholders to promote a product suite mindset.
- Code Review & Best Practices: Conduct thorough code reviews to ensure adherence to industry best practices in coding standards, maintainability, and performance optimization. Provide constructive feedback to support team growth and technical improvement.
- Testing & Quality Assurance: Design and implement robust test suites to ensure code quality and system reliability. Advocate for test automation and CI/CD pipelines to streamline testing processes and maintain service health.
- Service Health & Reliability: Monitor and maintain service health, utilizing telemetry and performance indicators to proactively address potential issues. Perform root cause analysis for incidents and drive preventive measures for improved system reliability.
- DevOps Ownership: Take end-to-end responsibility for features and services in a DevOps model, ensuring efficient incident response and maintaining high service availability.
- Documentation & Knowledge Sharing: Create and maintain comprehensive documentation for code, processes, and technical decisions. Promote knowledge sharing within the team to enable continuous learning and improvement.

Qualifications Required:
- Educational Background: Bachelor's degree in Computer Science, Engineering, or a related technical field; Master's degree preferred.
- Experience: 6+ years of professional software development experience, including significant exposure to AI/ML or GenAI applications. Demonstrated expertise in building scalable, production-grade software solutions.
- Technical Expertise: Advanced proficiency in Python, FastAPI, PyTest, Celery, and other Python frameworks. Deep knowledge of software design patterns, object-oriented programming, and concurrency.
- Cloud & DevOps Proficiency: Extensive experience with cloud technologies (e.g., GCP, AWS, Azure), containerization (e.g., Docker, Kubernetes), and CI/CD practices. Strong understanding of version control systems (e.g., GitHub) and work tracking tools (e.g., JIRA).
- AI/GenAI Knowledge: Familiarity with GenAI frameworks (e.g., LangChain, LangGraph), MLOps, and AI lifecycle management. Experience with model deployment and monitoring in cloud environments.

Additional Company Details: UKG is on the verge of significant growth, holding top positions in workforce management and human capital management globally. With a focus on AI-powered products catering to customers of all sizes and industries, UKG is dedicated to promoting diversity and inclusion in the workplace.
posted 1 week ago

Data Engineer

Lorven Technologies Private Limited
Experience: 6 to 10 Yrs
Work mode: Remote
Location: Gurugram, Delhi, Noida, Bangalore, Chennai, Kolkata, Pune, Mumbai City
Skills:
  • Azure
  • Data Engineering
  • Databricks
  • Data Factory
  • .NET
  • Python
Job Description
Data Engineer - Azure Data Factory, SQL, Python, Databricks, ETL
Data Engineer - .Net + Azure Data Factory

We are seeking a skilled Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining data systems and architectures that support our business needs. This role requires expertise in working with large datasets, cloud technologies, and advanced data platforms to ensure the availability, quality, and accessibility of data for business analytics and decision-making.

Key Responsibilities:
- Design, build, and maintain data pipelines to collect, process, and store data efficiently from various sources.
- Work with cross-functional teams to understand business requirements and deliver data solutions that meet the needs of the organization.
- Optimize data storage and retrieval methods to enhance system performance and data quality.
- Integrate data from multiple sources (databases, data lakes, cloud storage, etc.) to build comprehensive data sets.
- Build and maintain data infrastructure, ensuring it is scalable, reliable, and secure.
- Implement and manage data governance policies, ensuring data accuracy, consistency, and compliance.
- Conduct data modeling and provide insights into the data through advanced analytics tools and reports.
- Perform data transformations and data wrangling tasks to prepare data for analysis.
- Troubleshoot data issues and collaborate with stakeholders to resolve technical challenges.
- Ensure the integrity of data pipelines and systems by conducting routine testing and validation.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- Proven experience as a Data Engineer or in a similar role, with a minimum of [X] years of experience.
- Proficiency in data engineering tools and technologies such as SQL, Python, Java, Scala, and ETL frameworks.
- Experience working with cloud platforms (AWS, Azure, GCP) and data storage solutions (e.g., Redshift, Snowflake, BigQuery).
- Solid understanding of data modeling, database design, and cloud data architectures.
- Hands-on experience with data warehousing concepts and tools (e.g., Apache Hive, Apache Spark).
- Familiarity with data orchestration tools such as Apache Airflow and Azure Data Factory.
- Knowledge of real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus.
- Strong problem-solving skills and ability to troubleshoot complex data issues.
- Familiarity with data security best practices and data privacy regulations (e.g., GDPR, HIPAA).
posted 2 weeks ago

Abinitio Developer

CAPGEMINI TECHNOLOGY SERVICES INDIA LIMITED
Experience: 5 to 10 Yrs
Location: Gurugram, Delhi, Noida, Bangalore, Chennai, Hyderabad, Kolkata, Pune, Mumbai City
Skills:
  • Ab Initio
  • UNIX shell scripting
  • SQL
Job Description
Key Responsibilities:
- Design, develop, and implement ETL processes using Ab Initio GDE (Graphical Development Environment).
- Build and maintain Ab Initio graphs, plans, and sandboxes for data extraction, transformation, and loading.
- Work with business teams to understand data integration requirements and deliver efficient solutions.
- Use Ab Initio EME for version control, dependency management, and metadata governance.
- Perform data profiling, data validation, and quality checks using Ab Initio components and tools.
- Optimize ETL workflows for performance, scalability, and maintainability.
- Implement robust error handling, restartability, and logging mechanisms.
- Collaborate with DBAs, data modelers, and analysts to ensure data accuracy and consistency.
- Schedule and monitor jobs using Ab Initio Control Center (AICC) or enterprise schedulers.
- Support production systems, troubleshoot issues, and perform root cause analysis.

Required Technical Skills:
- Strong hands-on experience in Ab Initio GDE, EME, Co>Operating System, and Control Center.
- Proficiency with Ab Initio components such as Input/Output, Transform, Partition, Sort, Join, Lookup, Rollup, Reformat, Scan, and Dedup Sort, along with error handling using Rejects, Error Tables, and Error Ports for robust ETL design.
- Expertise in ETL design, development, and deployment for large-scale data environments.
- Proficiency in SQL and relational databases such as Oracle, Teradata, DB2, or SQL Server.
- Experience with UNIX/Linux shell scripting for automation and workflow integration.
- Understanding of data warehousing concepts (star schema, snowflake schema, slowly changing dimensions).
- Strong performance tuning and debugging skills in Ab Initio.
- Familiarity with data quality, metadata management, and data lineage.
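For illustration only, a minimal pandas sketch of the Type 2 slowly changing dimension pattern referenced above. Ab Initio would build this from graph components (Join, Reformat, etc.) rather than Python, and the column names used here (customer_id, city) are hypothetical.

```python
# Illustrative Type 2 SCD update: expire changed rows, append new current rows.
import pandas as pd

def scd2_upsert(dim: pd.DataFrame, incoming: pd.DataFrame, load_date: str) -> pd.DataFrame:
    """Close out changed rows and append new current versions (hypothetical columns)."""
    current = dim[dim["is_current"]]
    merged = current.merge(incoming, on="customer_id", suffixes=("_old", ""))
    changed_ids = merged.loc[merged["city_old"] != merged["city"], "customer_id"]

    # Expire the old versions of changed rows.
    expire_mask = dim["customer_id"].isin(changed_ids) & dim["is_current"]
    dim.loc[expire_mask, ["is_current", "end_date"]] = [False, load_date]

    # Append new current rows for changed and brand-new customers.
    new_ids = set(changed_ids) | (set(incoming["customer_id"]) - set(current["customer_id"]))
    new_rows = incoming[incoming["customer_id"].isin(new_ids)].copy()
    new_rows["start_date"], new_rows["end_date"], new_rows["is_current"] = load_date, None, True
    return pd.concat([dim, new_rows], ignore_index=True)
```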
posted 1 week ago

Power BI Architect

iLink Digital
Experience: 10 to 14 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Business Intelligence
  • Data Engineering
  • Solution Architecture
  • Power BI
  • Data Modeling
  • DAX
  • Sales Analytics
  • Data Governance
  • Security
  • Compliance
  • Power Query
  • Datawarehouse
  • AI
  • GenAI systems
  • Microsoft CoPilot
  • Azure AI services
  • LLM-based solutions
  • CRM integrations
  • Enterprise Reporting Systems
Job Description
As an experienced professional in Business Intelligence, Data Engineering, or Solution Architecture, your role will involve the following key responsibilities:
- Analyze current sales workflows, KPIs, and reporting frameworks.
- Assess GenAI system capabilities, data readiness, and integration requirements with CRM and other enterprise platforms.
- Conduct technical discovery sessions to align business goals with data architecture.
- Partner with stakeholders to define high-value GenAI-driven use cases such as HCP insights, next-best-action recommendations, cross-sell/upsell triggers, competitive intelligence, and objection handling.
- Translate business needs into scalable AI/BI architecture requirements.
- Design end-to-end integration architecture, embedding GenAI and CoPilot into Power BI reports and sales processes.
- Architect data pipelines connecting multiple source systems into Snowflake for centralized reporting.
- Ensure alignment with security, governance, and compliance standards.
- Provide technical guidance on data modeling, DAX, semantic models, and performance optimization in Power BI.
- Develop proof-of-concept (POC) and prototype solutions for top-priority use cases.
- Collaborate with field reps and sales teams to validate solution effectiveness.
- Gather feedback, measure business impact, and refine architecture before scaling.

Qualifications and Skills Required:
- 10+ years of experience in Business Intelligence, Data Engineering, or Solution Architecture.
- Expertise in Power BI, including data modeling, DAX, Power Query, custom visuals, and performance tuning.
- Strong hands-on experience with any data warehouse, including data pipelines, integrations, and optimization.
- Knowledge of AI/GenAI systems, Microsoft CoPilot, Azure AI services, or LLM-based solutions.
- Proven experience in solution architecture for sales analytics, CRM integrations, and enterprise reporting systems.
- Solid understanding of data governance, security, and compliance (GDPR, HIPAA, etc.).
- Ability to drive discovery workshops, solution design sessions, and technical stakeholder alignment.
- Strong communication and presentation skills with both technical and business stakeholders.
posted 2 weeks ago
Experience: 3 to 7 Yrs
Location: Gurugram, All India
Skills:
  • Python
  • SQL
  • dbt
  • Git
  • Airflow
  • Kafka
  • Avro
  • Power BI
  • Tableau
  • pandas
  • PySpark
  • CI/CD pipelines
  • Prefect
  • Kinesis
  • Azure Event Hub
  • Delta Lake
  • Parquet
  • Looker
Job Description
As a Data Engineer Senior Consultant at EY, you will be responsible for designing, building, and optimizing data solutions to enable advanced analytics and AI-driven business transformation. Your role will involve working with modern data engineering practices and cloud platforms to deliver robust and scalable data pipelines for various business domains such as finance, supply chain, energy, and commercial operations.

Key Responsibilities:
- Design, develop, and deploy end-to-end data pipelines for complex business problems, supporting analytics and modernizing data infrastructure.
- Design and implement data models, ETL/ELT workflows, and data integration solutions across structured and unstructured sources.
- Collaborate with AI engineers, data scientists, and business analysts to deliver integrated solutions that unlock business value.
- Ensure data quality, integrity, and governance throughout the data lifecycle.
- Optimize data storage, retrieval, and processing for performance and scalability on cloud platforms like Azure, AWS, GCP, Databricks, and Snowflake.
- Translate business requirements into technical data engineering solutions, including architecture decisions and technology selection.
- Contribute to proposals, technical assessments, and internal knowledge sharing.
- Perform data preparation, feature engineering, and MLOps activities to support AI initiatives.

Qualifications Required:
- Degree or equivalent certification in Computer Science, Data Engineering, Information Systems, Mathematics, or a related quantitative field.
- Proven experience in building and maintaining large-scale data pipelines using tools such as Databricks, Azure Data Factory, Snowflake, or similar.
- Strong programming skills in Python and SQL, with proficiency in data engineering libraries like pandas, PySpark, and dbt.
- Deep understanding of data modeling, ETL/ELT processes, and Lakehouse concepts.
- Experience with data quality frameworks, data governance, and compliance requirements.
- Familiarity with version control (Git), CI/CD pipelines, and workflow orchestration tools such as Airflow and Prefect.

Soft Skills:
- Strong analytical and problem-solving mindset with attention to detail.
- Good team player with effective communication and storytelling using data and insights.
- Consulting skills, including the development of presentation decks and client-facing documentation.

Preferred Criteria:
- Experience with real-time data processing tools like Kafka, Kinesis, and Azure Event Hub.
- Knowledge of big data storage solutions such as Delta Lake, Parquet, and Avro.
- Experience with data visualization tools like Power BI, Tableau, and Looker.
- Understanding of AI/ML concepts and collaboration with AI teams.

Preferred Qualifications:
- Certifications such as Databricks Certified Data Engineer Professional, Azure Data Engineer Associate, AWS Certified Data Analytics Specialty, and SnowPro Advanced: Data Engineer.

At EY, the focus is on building a better working world by creating new value for clients, people, society, and the planet while building trust in capital markets. EY teams leverage data, AI, and advanced technology to help clients shape the future with confidence and address the most pressing issues of today and tomorrow. With a diverse ecosystem of partners and a globally connected network, EY offers services across assurance, consulting, tax, strategy, and transactions in more than 150 countries and territories.
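For illustration only, a minimal pandas sketch of the kind of ELT step this listing describes: extract raw records, apply light cleaning and validation, and land them as Parquet for a warehouse load. The file paths and column names are hypothetical.

```python
# Minimal ELT-style step: extract, lightly clean/validate, land as Parquet.
# Paths and columns are placeholders; pyarrow (or fastparquet) is assumed
# available for to_parquet.
import pandas as pd

def run_pipeline(src_path: str, out_path: str) -> None:
    raw = pd.read_csv(src_path, parse_dates=["order_date"])

    # Basic data-quality gates: drop exact duplicates and rows missing keys.
    clean = raw.drop_duplicates().dropna(subset=["order_id", "customer_id"])

    # Light transformation: normalize text and derive a partition-friendly column.
    clean["country"] = clean["country"].str.strip().str.upper()
    clean["order_month"] = clean["order_date"].dt.to_period("M").astype(str)

    # Land as Parquet for downstream warehouse ingestion.
    clean.to_parquet(out_path, index=False)

run_pipeline("raw/orders.csv", "staged/orders.parquet")
```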
posted 2 weeks ago
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Data Science
  • Advanced Analytics
  • Python
  • HTML
  • Java
  • MongoDB
  • MySQL
  • Power BI
  • Tableau
  • Excel
  • Data Visualization
  • Quantitative Analysis
  • Business Intelligence
  • Database Management
  • JSON
  • AI Model Development
  • Node.js
  • PostgreSQL
  • CRM Platforms
  • Data Integration Tools
Job Description
Role Overview: As a passionate and creative Data Scientist and AI specialist at Synopsys, you will be responsible for transforming data into actionable insights to elevate customer success processes and products. You will collaborate with cross-functional teams to define project requirements, develop and optimize generative AI models, and analyze large datasets to drive improvements in CRM and customer support systems. Your role will involve preparing reports, forecasting data trends, designing and implementing AI models, establishing best practices, and communicating insights effectively to diverse stakeholders.

Key Responsibilities:
- Prepare and maintain Excel and Power BI reports to analyze usage, productivity, and recognition metrics.
- Forecast data trends, generate leads, and identify gaps for focused enhancements in customer success processes.
- Design, develop, and implement generative AI models and algorithms in partnership with R&D and cross-functional teams.
- Optimize existing GenAI models for improved performance, scalability, and efficiency.
- Establish and promote best practices and standards for generative AI development within the organization.
- Consolidate and analyze unstructured data sources to generate actionable insights for CRM product enhancement.
- Develop, code, and automate processes to cleanse, integrate, and evaluate large datasets from multiple sources.
- Interpret and communicate insights to product, service, and business managers for informed decision-making.
- Build system-generated reports, dashboards, and reporting tools to support data informatics and business intelligence.
- Evaluate agentic support models for ROI and integrate them into CRM systems.
- Design, develop, and test web-based prototypes and applications, focusing on user interface and experience.
- Maintain high-quality, responsive web pages using HTML, CSS, Java, and Node.js, and collaborate with IT and product teams to deliver functional web solutions.

Qualifications Required:
- Proficiency in data science, AI model development, and advanced analytics.
- Experience with CRM platforms (Salesforce), data integration tools (Coveo, SharePoint, Snowflake), and data visualization (Power BI, Tableau, Advanced Excel).
- Strong programming skills in Python, HTML, Java, and Node.js, and database management (PostgreSQL, MongoDB, MySQL, JSON).
- Ability to design and develop web-based prototypes, dashboards, and reporting tools.
- Experience in quantitative analysis, business intelligence, and communicating insights to diverse audiences.
posted 3 weeks ago

Core Data Product Analyst

Clearwater Analytics
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Trade
  • Analytics
  • Pricing
  • Accounting
  • Performance
  • Risk
  • capital markets
  • equity
  • fixed income
  • derivatives
  • SQL
  • Excel
  • data governance
  • Security Reference
  • Custody data domains
  • data visualization tools
  • data management principles
  • data quality processes
Job Description
As a Core Data Product Analyst at our company, you will play a crucial role in analyzing and interpreting complex data sets to drive strategic decision-making and product development for our investment data management platform. Reporting to a senior product leader, you will collaborate closely with internal stakeholders and clients to ensure that our data management solutions meet user needs and business objectives. Your insights and contributions will help identify trends, inform our product roadmap, and enhance user experience.

Your key responsibilities will include:
- Collaborating with product managers to define data requirements and support the development of analytical models.
- Conducting data analysis to identify trends, issues, and opportunities for improving product performance.
- Generating reports and dashboards to provide actionable insights to stakeholders.
- Advocating for a culture of data-driven decision-making within the organization.
- Working with engineering teams to maintain data integrity and accuracy in product development.
- Supporting the prioritization of product features based on data analysis and user feedback.

To excel in this role, you must possess:
- Prior experience in the investment management industry with strong knowledge of the Security Reference, Trade, Analytics, Pricing, Accounting (ABOR/IBOR), Performance, Risk, and Custody data domains.
- Knowledge of capital markets and expertise in equity, fixed income, and derivatives.
- 3-6 years of experience in data analysis or a related field, preferably within the investment management industry.
- Strong understanding of data management principles, including data governance and data quality processes.
- Proficiency with SQL, Excel, and data visualization tools (e.g., Tableau, Power BI).
- Familiarity with capital markets and financial data domains.
- Excellent analytical problem-solving skills, attention to detail, and strong communication skills.
- Ability to work effectively in a fast-paced environment and manage multiple projects simultaneously.
- Experience with cloud platforms (AWS/Azure/GCP) and relational or NoSQL databases is a plus.
- Knowledge of Snowflake is appreciated.

Your educational background should include a Bachelor's degree in Data Science, Computer Science, Statistics, or a related field. A Master's degree is considered a plus.

Nice-to-have qualifications include experience in alternative asset management or related financial services, knowledge of machine learning techniques in data analytics, and exposure to working with portfolio managers, traders, and researchers. If you are passionate about leveraging data to drive strategic decisions and product development in the investment management industry, we encourage you to apply and be a part of our dynamic team.
posted 6 days ago

Databricks consultant

Themesoft Inc.
Experience: 10 to 14 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Python
  • SQL
  • DAB
  • Spark
  • Databricks
  • DBT Data Build Tool
  • Delta tables
Job Description
As a Databricks consultant with 10 to 12 years of experience, based at Cognizant's office (PAN India), your role will involve the following responsibilities:
- Build and maintain robust data pipelines using Python, Databricks, Spark, DBT, and Delta Lake to transform and deliver high-quality, analytics-ready datasets.
- Design and implement efficient data models (star, snowflake, and dimensional schemas), choosing normalized or denormalized structures based on performance and use cases.
- Write performant SQL and Databricks notebooks, manage versioning and orchestration, and ensure data quality and lineage across pipelines.
- Leverage tools like GitHub Actions/Jenkins to automate testing, validation, and deployments.
- Implement best practices in data modeling, performance tuning, and scalable architecture and coding practices.

Qualifications required for this role include proficiency in Python, Databricks, DBT (Data Build Tool), SQL, DAB, Spark, and Delta tables. Nice-to-have skills include experience with Collibra DQ, Collibra, and Jenkins. Experience with Autoloader and SAP integration for ingestion workflows, Collibra, and Power BI would be considered an added advantage.
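For illustration only, a minimal PySpark sketch of the kind of Delta-table pipeline step this listing describes, assuming a Databricks-provided SparkSession named `spark`; the paths, schema, and table names are hypothetical.

```python
# Illustrative sketch (not the project's actual pipeline): read a raw source,
# apply a light transformation, and write a partitioned managed Delta table.
from pyspark.sql import functions as F

raw = (
    spark.read.format("json")
    .load("/mnt/raw/orders/")                      # hypothetical landing path
)

orders_fact = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .select("order_id", "customer_id", "order_date", "amount")
)

# Write as a managed Delta table, partitioned for query pruning.
(
    orders_fact.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_fact")          # hypothetical schema.table
)
```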
posted 2 days ago
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • SQL
  • Data integration
  • Python
  • Data Cleaning
  • Data handling
  • AI Models
  • Data Standardization
Job Description
Role Overview: At EY, you will have the opportunity to build a career tailored to your uniqueness, with global support, an inclusive culture, and cutting-edge technology. Your voice and perspective are valued to help EY continually improve. Join EY to create an exceptional experience for yourself and contribute to building a better working world for all.

Key Responsibilities:
- Perform data assessments for clients using various frameworks and assist them in developing a successful data strategy.
- Collaborate closely with IT, supply chain, and other stakeholders to understand data analysis needs and translate them into technical requirements.
- Develop strategies to validate, clean, transform, and monitor data to ensure accuracy, consistency, completeness, reliability, and timeliness for intended uses.
- Identify new data to capture and standardize, oversee standard reports and dashboards, and ensure compliance with legal and other standards.
- Maintain data documentation, collaborate with cross-functional teams, and stay updated on the latest data trends and technologies in supply chain management.
- Participate in data governance projects from requirements gathering to deployment, perform data analysis and manipulation based on client requirements, and implement data-driven strategic initiatives.
- Possess strong SQL, data integration, and data handling skills, with exposure to AI models, Python, and data cleaning/standardization.

Qualification Required:
- Minimum 5 years of experience as a Data Quality/Cleansing Specialist with expertise in data quality/maturity frameworks.
- Proficiency in Data Governance design and setup, data strategy roadmap development, and adoption.
- Familiarity with data quality standards, data profiling, data quality assessment, data cleaning, monitoring and control, and data governance.
- Experience with cloud databases like Snowflake, Azure, and Databricks.
- Exceptional communication skills to build relationships with senior executives, superior stakeholder management skills, critical thinking, problem-solving abilities, and the capacity to translate complex solutions into clear content.
- Ability to collect and identify business requirements and translate them into functional requirements and acceptance criteria.

Additional Company Details: EY is dedicated to helping a diverse range of clients, from startups to Fortune 500 companies, with varied and inspiring projects. Education, coaching, and personal development are emphasized, along with opportunities for skills development and career progression. You will work in an interdisciplinary environment that values high quality and knowledge exchange, receiving support, coaching, and feedback from engaging colleagues. EY's mission is to build a better working world by creating long-term value for clients, people, and society, and fostering trust in the capital markets. Through data and technology, EY teams worldwide provide assurance and support clients in growth, transformation, and operations across various sectors.
posted 2 months ago
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Power BI
  • Data modeling
  • DAX
  • SharePoint
  • Data visualization
  • Python
  • R
  • Data analysis
  • Agile development
  • Power Query
  • ETL processes
  • SQL databases
  • Azure storage
  • Microsoft Fabric
  • Row-level security
  • Power BI report optimization
  • Azure SQL
  • Power BI Desktop
  • Power BI Service
  • Charting technologies
Job Description
In this role as a Power BI professional with Fabric expertise at GlobalLogic, you will be responsible for designing, developing, and maintaining Power BI reports and dashboards to derive actionable insights from assessment data. Your key responsibilities will include:
- Working with data from the company's established ETL processes and other sources
- Creating and optimizing data models within Power BI for accuracy and performance
- Applying innovative visualization techniques to effectively communicate assessment analytics
- Deploying and maintaining Power BI reports in a Microsoft Fabric environment
- Collaborating with product management to prioritize deliveries and design improvements
- Reviewing new analytics and reporting technologies with the product team
- Understanding data context and reporting requirements with subject matter experts
- Maintaining technical documentation and source control related to project design and implementation

Qualifications required for this role include:
- Bachelor's degree in computer science, analytics, or a related field, or equivalent work experience
- Minimum of 5 years of experience working with Power BI and Power Query, loading data from Azure and SharePoint
- Strong DAX skills for writing complex calculations and statistical operations
- Proficiency in data modeling, including star and snowflake schemas
- Experience implementing row-level security and dynamic row-level security in Power BI reports
- Familiarity with Power BI report performance optimization techniques
- Practical experience with Azure SQL or similar databases for data exploration
- Excellent communication skills and numerical abilities
- Understanding of data protection principles
- Ability to self-organize, prioritize tasks, and work collaboratively at all levels

Preferred qualifications include experience in Python or R for analysis, familiarity with advanced charting technologies, knowledge of assessment data and psychometric principles, experience with Azure, and familiarity with Agile development practices.

At GlobalLogic, you can expect a culture of caring, commitment to learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. As a trusted digital engineering partner, GlobalLogic collaborates with clients to transform businesses and redefine industries through intelligent products, platforms, and services.
posted 2 months ago
Experience: 5 to 9 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Power BI
  • Data modeling
  • DAX
  • Data modeling
  • Data validation
  • Python
  • R
  • Data visualization
  • SharePoint
  • Power Query
  • ETL processes
  • Row-level security
  • Azure SQL
  • Data exploration
  • Power BI Desktop
  • Power BI Service
  • SQL databases
  • Azure storage
  • Statistical operations
  • Data protection principles
  • Agile development practices
Job Description
Role Overview: You will be responsible for designing, developing, and maintaining Power BI reports and dashboards to derive actionable insights from assessment data. You will collaborate with product management and subject matter experts to ensure accurate and performant reports. Your role will involve deploying and maintaining Power BI reports in a Microsoft Fabric environment and utilizing innovative visualization techniques for clear communication of assessment analytics.

Key Responsibilities:
- Design, develop, and maintain Power BI reports and dashboards
- Work with data from established ETL processes and other sources
- Create and optimize data models within Power BI
- Apply innovative visualization techniques for clear communication
- Deploy and maintain Power BI reports in a Microsoft Fabric environment
- Collaborate with product management and subject matter experts
- Maintain technical documentation and source control

Qualifications:
- Bachelor's degree in computer science, analytics, or a related field
- Minimum 5 years of experience with Power BI and Power Query
- Experience with data consumption from various sources, including SQL databases, Azure storage, and SharePoint
- Strong DAX skills with the ability to write complex calculations
- Understanding of data modeling, including star and snowflake schemas
- Experience implementing row-level security in Power BI reports
- Familiarity with Power BI report performance optimization techniques
- Practical experience with Azure SQL for data exploration
- Excellent communication skills and numerical skills
- Understanding of data protection principles
- Ability to self-organize, prioritize, and track tasks

Company Details: GlobalLogic is a trusted digital engineering partner that collaborates with clients to transform businesses and industries through intelligent products, platforms, and services. The company prioritizes a culture of caring, continuous learning and development, interesting and meaningful work, balance, flexibility, and integrity. By joining GlobalLogic, you will be part of a high-trust organization that values integrity, truthfulness, and candor in everything it does.
posted 2 months ago

Core Data Product Analyst

Clearwater Analytics (CWAN)
Experience: 3 to 7 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Trade
  • Analytics
  • Pricing
  • Accounting
  • Performance
  • Risk
  • capital markets
  • equity
  • fixed income
  • derivatives
  • SQL
  • Excel
  • data governance
  • Security Reference
  • Custody data domains
  • data visualization tools
  • data management principles
  • data quality processes
Job Description
As a Core Data Product Analyst at our company, your primary role will involve analyzing and interpreting complex data sets to drive strategic decision-making and product development for our investment data management platform. You will report to a senior product leader and collaborate closely with internal stakeholders and clients to ensure that our data management solutions meet user needs and business objectives. Additionally, you will contribute to identifying trends and insights to inform our product roadmap and enhance user experience.

**Key Responsibilities:**
- Collaborate with product managers to define data requirements and support the development of analytical models.
- Conduct data analysis to identify trends, issues, and opportunities for improving product performance.
- Generate reports and dashboards to provide actionable insights to stakeholders.
- Advocate for a culture of data-driven decision-making within the organization.
- Work with engineering teams to maintain data integrity and accuracy in product development.
- Support the prioritization of product features based on data analysis and user feedback.

**Qualifications Required:**
- Prior experience in the investment management industry with strong knowledge of the Security Reference, Trade, Analytics, Pricing, Accounting (ABOR/IBOR), Performance, Risk, and Custody data domains.
- Knowledge of capital markets and expertise in equity, fixed income, and derivatives.
- 3-6 years of experience in data analysis or a related field, preferably within the investment management industry.
- Strong understanding of data management principles, including data governance and data quality processes.
- Proficiency with SQL, Excel, and data visualization tools (e.g., Tableau, Power BI).
- Familiarity with capital markets and financial data domains.
- Excellent analytical problem-solving skills and attention to detail.
- Strong communication skills to convey complex information clearly to diverse audiences.
- Ability to work effectively in a fast-paced environment and handle multiple projects simultaneously.
- Experience with cloud platforms (AWS/Azure/GCP) and relational or NoSQL databases is a plus.
- Knowledge of Snowflake is appreciated.

**Education Background:**
- Bachelor's degree in Data Science, Computer Science, Statistics, or a related field. A Master's degree is a plus.

In addition to the above requirements, it would be beneficial if you have:
- Experience working in alternative asset management or related financial services.
- Knowledge of machine learning techniques and their application in data analytics.
- Exposure to working with portfolio managers, traders, and researchers.
posted 2 months ago
Experience: 8 to 12 Yrs
Location: Noida, Uttar Pradesh
Skills:
  • Python
  • SQL
  • Power BI
  • AWS
Job Description
As a Senior Data Analyst, your role will involve strategic leadership & management, technical oversight, team leadership, and project governance in the context of HRIS implementations and optimizations.

Your key responsibilities will include:
- Serve as the primary business-facing technical advisor for Workday HRIS implementations and optimizations
- Develop comprehensive HRIS strategies and roadmaps aligned with business objectives
- Lead discovery sessions to gather and analyze business requirements
- Present technical solutions and recommendations to executive stakeholders
- Manage client expectations and ensure high satisfaction throughout the project lifecycle

In terms of technical oversight, you will be required to:
- Make critical design decisions regarding system configuration, integrations, and data management
- Review and approve technical specifications, solution designs, and test plans
- Identify and resolve complex technical challenges that arise during implementations
- Ensure solutions adhere to best practices and governance requirements

Your role will also involve team leadership responsibilities such as:
- Providing direction and mentorship to a team of data engineers
- Assigning tasks and monitoring progress to ensure on-time, high-quality deliverables
- Facilitating knowledge sharing and professional development within the team

Additionally, you will be responsible for project governance by:
- Identifying and mitigating project risks and dependencies
- Ensuring adherence to project methodology and quality standards
- Driving continuous improvement in delivery processes and methodologies

Qualifications and skills required for this role include:
- 8+ years of experience in HRIS consulting, implementation, or administration
- 5+ years of hands-on experience with Workday/Greenhouse, including multiple full lifecycle implementations
- Experience with cloud data warehouse platforms, preferably Snowflake
- Experience with architecting or working with ELT technologies (such as DBT) and data architectures
- Workday certifications, including at least one advanced certification
- Experience leading teams and managing complex HRIS projects
- Deep understanding of HR processes, compliance requirements, and industry best practices
- Strong client relationship management and executive communication skills
- Proven problem-solving abilities and technical troubleshooting expertise

Please note that the HRIS module Workday is mandatory for this role.