Agile Modeling Jobs in Baramati

277 Agile Modeling Jobs near Baramati

posted 1 week ago

Senior Snowflake Data Engineer

Hucon Solutions India Pvt. Ltd.
Experience: 8 to 13 Yrs
Salary: 8 - 18 LPA
location
Pune, Bangalore, Chennai, Noida, Hyderabad, Gurugram, Kolkata, Mumbai City, Delhi

skills
  • aws
  • sql
  • data modeling
  • snowflake
  • dbt (data build tool)
  • FiveTran
Job Description
Senior Snowflake Data Engineer Location: PAN India Experience: 8+ Years Skills: Snowflake, dbt, FiveTran, Snowpipe, AWS (MWAA, S3, Lambda), GitHub CI/CD Job Description We are looking for an experienced Senior Snowflake Data Engineer with strong expertise in modern data warehousing, cloud technologies, and ELT pipeline development. The ideal candidate should have deep hands-on experience in Snowflake, dbt, cloud environments, and CI/CD practices, with the ability to design scalable and efficient data solutions. Key Responsibilities Analyze, integrate, model, and interpret large and complex datasets from multiple sources. Design and implement ELT data pipelines using dbt with Snowflake as the primary cloud data warehouse. Build efficient and scalable data transformation pipelines using dbt at an advanced level. Work with ETL/ELT and data governance tools such as FiveTran and Alation. Utilize advanced Snowflake features such as RBAC, Dynamic Tables, and various optimization techniques. Ensure strong data modelling and warehousing practices across diverse database technologies. Manage orchestrations using Apache Airflow or AWS MWAA, along with CI/CD pipelines. Oversee continuous deployment, monitoring, and operations of data solutions using GitHub Actions, Terraform, and other DevOps tools. Collaborate with technical and non-technical stakeholders through effective communication. Contribute to building future-state Data Warehouse capabilities using cutting-edge technologies. Adapt and work efficiently within Agile methodology. Preferred Qualifications 2+ years of hands-on experience with Snowflake as a Cloud Data Warehouse and Data Lake platform. Strong understanding of cloud environments, especially AWS (S3, Lambda, MWAA). Airline industry domain experience is a plus.
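For candidates sizing up the stack above, the sketch below shows the kind of ELT step this role describes: a transformation pushed down into Snowflake from Python. It is illustrative only and not taken from the employer's codebase; the account settings, databases, schemas, and table names are placeholder assumptions.

```python
# Illustrative sketch only, not the employer's code: one ELT step pushed down
# to Snowflake via the snowflake-connector-python package. Account settings,
# databases, schemas, and table names are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # assumed warehouse
    database="RAW",             # assumed landing database
    schema="SALES",             # assumed schema
)

try:
    cur = conn.cursor()
    # ELT: the transformation runs inside Snowflake, not on the client.
    cur.execute("""
        CREATE OR REPLACE TABLE ANALYTICS.SALES.DAILY_ORDERS AS
        SELECT order_date, region, SUM(amount) AS total_amount
        FROM RAW.SALES.ORDERS
        GROUP BY order_date, region
    """)
    print("DAILY_ORDERS rebuilt")
finally:
    conn.close()
```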

posted 5 days ago
Experience: 7 to 11 Yrs
location
Maharashtra
skills
  • Agile methodologies
  • Project management
  • Leadership
  • Data management
  • Data modeling
  • Data visualization
  • Communication
  • Presentation
  • Analytical skills
  • Scrum Master
  • Jira tool
  • Agile certification
  • Cloud platforms
  • Integration tools
  • Problem-solving
  • Agile transformation
  • Project Manager
Job Description
As an Agile Delivery Manager at Eaton, your primary responsibility will be to lead and manage Agile Data engineering projects in the supply chain domain. You will oversee product and software development teams to ensure adherence to Agile principles and facilitate Agile ceremonies. Collaborating with cross-functional teams and stakeholders, you will identify and mitigate risks, monitor project progress, and foster a culture of continuous improvement and Agile maturity. Your role will involve training, mentoring, and supporting Scrum teams, as well as assisting with roadmapping, planning, and capacity management. You will also be responsible for collecting, tracking, and reporting Agile and Lean metrics, communicating updates and risks to stakeholders, and supporting cost tracking and resource planning. Key Responsibilities: - Lead and manage Agile Data engineering projects in the supply chain domain. - Facilitate Agile ceremonies and ensure adherence to Agile principles. - Collaborate with cross-functional teams and stakeholders. - Identify and mitigate risks, and monitor project progress. - Foster a culture of continuous improvement and Agile maturity. - Train, mentor, and support Scrum teams. - Assist with roadmapping, planning, and capacity management. - Collect, track, and report Agile and Lean metrics. - Communicate updates and risks to stakeholders. - Support cost tracking and resource planning. In addition to the above responsibilities, you will be responsible for supporting data and application integration technology, engaging in technical discussions, and driving architectural decisions. You will ensure the scalability and reusability of data products, facilitate the planning of work involved in data management and integration delivery, and collaborate with data engineers and architects to align with data strategies. Qualifications: - Bachelor's degree - Advanced degree in a related field - Minimum 10 years of IT experience, 7+ years in a Scrum Master, Dev Manager, or Project Manager role Skills: - Strong understanding of Agile methodologies and project management principles - Experience with Agile software tools and producing Agile artifacts - Good understanding of Project Finance management - Strong understanding and hands-on experience with Jira tool - Certification in Agile (e.g., Certified ScrumMaster, PMI-ACP) - Experience in a leadership role within Agile transformation - Proven track record of delivering complex projects - Strong analytical and problem-solving skills - Knowledgeable in data management and integration - Experience with cloud platforms and data modeling - Proficiency in data visualization and integration tools - Strong communication and presentation skills - Patience and empathy working with mentees and people new to Agile - Comfortable working directly with both technical and non-technical audiences - Effective listening skills with the intention of understanding - Experience working with diverse, global cultures, organizations, and teams - Ability to influence and drive change at all levels,
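As an illustration of the "collect, track, and report Agile and Lean metrics" responsibility above, a minimal sketch follows; the sprint figures are invented and, in practice, this data would be exported from Jira rather than hard-coded.

```python
# Toy illustration: velocity and commitment reliability from sprint history.
# Figures are invented; in practice they would come from Jira.
sprints = [
    {"name": "Sprint 21", "committed": 40, "completed": 34},
    {"name": "Sprint 22", "committed": 42, "completed": 41},
    {"name": "Sprint 23", "committed": 38, "completed": 36},
]

velocity = sum(s["completed"] for s in sprints) / len(sprints)
reliability = sum(s["completed"] for s in sprints) / sum(s["committed"] for s in sprints)

print(f"Average velocity: {velocity:.1f} story points per sprint")
print(f"Commitment reliability: {reliability:.0%}")
```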
ACTIVELY HIRING
posted 1 week ago
Experience: 8 to 12 Yrs
location
Pune, Maharashtra
skills
  • SQL
  • Automation
  • Analytics
  • Data Analysis
  • Process Modeling
  • Data Analytics
  • Problem-solving
  • AI
  • Agile Software Methodology
  • CI/CD Practices
Job Description
Role Overview: Are you interested in pursuing your career in Asset Management and working in a data-driven business environment? As a Data and Business Process Analyst at UBS, your role will involve collaborating with a cross-functional team to identify inefficiencies, analyze existing workflows, and simplify processes to enhance operational efficiency, ensure compliance, and drive performance improvement. You will have the opportunity to leverage data analysis tools, process modeling, and problem-solving skills to redesign processes, implement changes aligned with business goals, and drive automation initiatives across various operation functions. Key Responsibilities: - Act as a data and business process analyst to identify inefficiencies and analyze existing workflows for simplification within operations - Utilize data analysis tools, process modeling, and problem-solving skills to identify bottlenecks, redesign processes, and implement changes aligned with business goals - Collaborate with different teams to understand their needs, identify areas for automation, and design the target operating model for transforming data and operations landscape - Leverage data analytics to identify patterns, trends, and anomalies in data, providing insights for business decisions and improving data quality - Contribute to the growth of AM Operations by researching AI/automation tools to enhance operational efficiency - Perform data collection and analysis, process mapping, requirements gathering, stakeholder management, process implementation, and change management activities Qualifications Required: - Minimum of 8 years of hands-on experience in data and business analysis with a focus on operation process optimization, preferably within the financial industry, especially Asset Management - Proficiency with SQL and automation/analytics tools for data extraction, analysis, and reporting - Strong ability in data and process modeling, working with complex datasets, and supporting decision-making and data architecture design - Analytical and problem-solving skills with the capacity to interpret and leverage data and AI capabilities for process improvements - Detail-oriented, solution-focused, with excellent communication skills, proactive approach, and understanding of Agile software methodology and modern CI/CD practices - Visionary mindset with a passion for innovation, cutting-edge technologies, and driving the adoption of AI and GenAI in the data and operations space About the Company: UBS is the world's largest and only truly global wealth manager with operations in four business divisions, including Global Wealth Management, Personal & Corporate Banking, Asset Management, and the Investment Bank. With a presence in major financial centers across more than 50 countries, UBS stands out for its global reach and expertise. At UBS, people are valued for their diverse skills, experiences, and backgrounds, driving ongoing success through a supportive team, growth opportunities, and flexible working options. The inclusive culture at UBS fosters collaboration and innovation, leveraging artificial intelligence (AI) to work smarter and more efficiently. (Note: The "Disclaimer / Policy Statements" section has been omitted from the Job Description as it does not directly relate to the job role.)
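A small, hypothetical example of the anomaly-spotting described above, using pandas; the file, column, and 3-sigma threshold are assumptions and do not reflect UBS systems.

```python
# Hypothetical example: flagging unusual days in an operational dataset with a
# simple z-score rule. The file, column, and threshold are assumptions.
import pandas as pd

volumes = pd.read_csv("daily_settlement_volumes.csv")   # assumed extract
vol = volumes["settled_volume"]

z = (vol - vol.mean()) / vol.std(ddof=0)
anomalies = volumes[z.abs() > 3]                         # 3-sigma outliers

print(f"{len(anomalies)} anomalous days out of {len(volumes)}")
```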
ACTIVELY HIRING

posted 2 months ago

Software Engineer II, Associate

JPMorgan Chase - Candidate Experience page
Experience: 2 to 6 Yrs
location
Maharashtra
skills
  • Python
  • NumPy
  • software engineering
  • agile development
  • AWS
  • statistical modeling
  • pandas
  • scikit-learn
  • quantitative modeling
  • ML/AI models
  • cloud technologies
  • portfolio construction
  • trading technology
Job Description
As a Software Engineer II at JPMorgan Chase within the 55ip Quant team, you will have the opportunity to join an innovative team and contribute to the future of portfolio optimization using advanced quantitative, ML, and AI techniques. Your role will involve collaborating with quants, product, and technology professionals to design, implement, and optimize quantitative models for portfolio management. Your work will directly impact investment strategies and client outcomes. **Key Responsibilities:** - Develop, implement, and maintain quantitative models for portfolio optimization. - Write secure, high-quality production code in Python and review and debug code written by the team. - Design and integrate models into the simulation environment. - Enhance strategy performance through analytics review and code tuning. - Apply software engineering best practices throughout development. - Identify opportunities to automate remediation of recurring issues to enhance operational stability of software applications and systems. **Qualifications Required:** - Formal training or certification in software engineering concepts and a minimum of 2 years of applied experience. - Proficiency in Python and advanced experience with numerical libraries such as NumPy, pandas, and scikit-learn. - Hands-on practical experience in high-performance system design, application development, testing, and operational stability. - Hands-on experience in developing, testing, and deploying quantitative, ML/AI models. - Familiarity with working in an agile development environment. - Experience with cloud technologies, particularly AWS. - Proven experience working with large datasets. In addition to the required qualifications, capabilities, and skills, the following preferred qualifications are also desirable: - Experience working with a quantitative team. - Knowledge of statistical and quantitative modeling techniques, including practical application to real-world data analysis and problem-solving. - Previous experience in portfolio construction, implementation, or trading technology.,
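To make the "quantitative models for portfolio optimization" requirement concrete, here is a minimal NumPy sketch of a closed-form minimum-variance portfolio on synthetic data; it is a textbook illustration, not the team's actual methodology.

```python
# Textbook illustration on synthetic data, not the team's actual models:
# closed-form minimum-variance portfolio weights with NumPy.
import numpy as np

rng = np.random.default_rng(42)
returns = rng.normal(0.0005, 0.01, size=(250, 4))   # 250 days, 4 synthetic assets

cov = np.cov(returns, rowvar=False)                  # sample covariance matrix
ones = np.ones(cov.shape[0])

# w = inv(Cov) @ 1 / (1' @ inv(Cov) @ 1)
inv_cov_ones = np.linalg.solve(cov, ones)
weights = inv_cov_ones / (ones @ inv_cov_ones)

daily_vol = np.sqrt(weights @ cov @ weights)
print("weights:", np.round(weights, 4))
print("daily portfolio volatility:", round(float(daily_vol), 6))
```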
ACTIVELY HIRING
posted 1 month ago

Talend ETL Developer

Team Geek Solutions
Experience: 2 to 6 Yrs
location
Pune, Maharashtra
skills
  • data warehousing
  • troubleshooting
  • workflow management
  • etl
  • data modeling
  • sql
  • performance tuning
  • data governance
  • data integration
  • agile methodology
  • talend
  • etl processes
  • data profiling
  • analysis
  • sql proficiency
Job Description
Role Overview: As a Talend ETL Developer at Team Geek Solutions, your primary responsibility will be to design, develop, and maintain ETL processes using Talend. You will collaborate with business stakeholders to understand data requirements, develop SQL queries, and ensure data accuracy and quality through data profiling and analysis. Your role will also involve working with team members to design data warehouses, develop data transformation logic, and ensure compliance with data governance and security policies. You will play a key role in supporting data migration initiatives, managing workflow scheduling, and providing technical support and training to team members. Key Responsibilities: - Design, develop, and maintain ETL processes using Talend. - Implement data integration solutions to consolidate data from various systems. - Collaborate with business stakeholders to understand data requirements. - Develop and optimize SQL queries to extract and manipulate data. - Perform data profiling and analysis to ensure data accuracy and quality. - Monitor and troubleshoot ETL jobs to ensure smooth data flow. - Maintain documentation for ETL processes and data model designs. - Work with team members to design and enhance data warehouses. - Develop data transformation logic to meet business needs. - Ensure compliance with data governance and security policies. - Participate in code reviews and contribute to team knowledge sharing. - Support data migration initiatives during system upgrades. - Utilize Agile methodology for project management and delivery. - Manage workflow scheduling and execution of ETL tasks. - Provide technical support and training to team members as needed. Qualifications: - Bachelor's degree in Computer Science, Information Technology, or related field. - Proven experience as an ETL Developer, mandatory with Talend. - Strong understanding of ETL frameworks and data integration principles. - Proficient in writing and troubleshooting SQL queries. - Experience in data modeling and database design. - Familiarity with data quality assessment methodologies. - Ability to analyze complex data sets and provide actionable insights. - Strong problem-solving and analytical skills. - Excellent communication and interpersonal skills. - Ability to work collaboratively in a team-oriented environment. - Knowledge of data warehousing concepts and best practices. - Experience with Agile development methodologies is a plus. - Willingness to learn new technologies and methodologies. - Detail-oriented with a commitment to delivering high-quality solutions. - Ability to manage multiple tasks and deadlines effectively. - Experience with performance tuning and optimization of ETL jobs.,
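The data profiling duty above can be as simple as the pandas sketch below, which summarizes nulls, distinct values, and duplicates; the extract name and columns are hypothetical, and in a Talend job this kind of check would usually sit inside the ETL flow itself.

```python
# A small profiling pass with pandas; the extract and its columns are
# hypothetical placeholders, not a real customer dataset.
import pandas as pd

df = pd.read_csv("customers_extract.csv")   # assumed staging extract

profile = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "null_count": df.isna().sum(),
    "null_pct": (df.isna().mean() * 100).round(2),
    "distinct": df.nunique(),
})

print(profile)
print("duplicate rows:", int(df.duplicated().sum()))
```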
ACTIVELY HIRING
posted 2 days ago
Experience: 5 to 12 Yrs
location
Maharashtra
skills
  • Risk management
  • SDLC
  • Waterfall
  • Iterative methodologies
  • Agile methodologies
  • Project Management
  • Governance
  • Business architecture
  • Simplification
  • UAT
  • Automation
  • Process design
  • Database design
  • SQL
  • Python
  • R
  • MS Excel
  • MS PowerPoint
  • MS Word
  • MS Visio
  • Process reengineering
  • Controls
  • Developing solutions
  • Gen AI modeling tools
  • Building reporting frameworks
  • FRB's Supervisory Guidance on MRM (SR 11-7, SR 15-18)
  • VBA skills
Job Description
Role Overview: Model Risk Management (MRM) is an integral part of Citi's Global Risk Management, responsible for providing Independent Oversight of models across the firm. As a Vice President joining the System Strategy and Oversight Team within the Model Risk Management Inventory & Initiative Management Group, you will play a crucial role in driving the reengineering of MRMS, the Citi Model Risk Management System, in alignment with Model Risk Management Policy and Procedures, as well as the overall Model Risk system strategy. Your responsibilities will include translating policies, procedures, and guidelines into process maps and concrete tasks, identifying dependencies, decision points, actors, opportunities for streamlining, and building system solutions to support these objectives. You will collaborate with various stakeholders within and outside Risk management to streamline, simplify, and implement model life cycle processes in MRMS. Additionally, you will be involved in authoring Business requirements, reengineering processes and system solutions for simplification and automation, liaising with IT partners, and partnering with validation and development groups to drive integration of metrics and documentation digitization, Gen AI POCs with MRMS target state. Key Responsibilities: - Drive reengineering of MRMS to align with Model Risk Management Policy and Procedures and overall Model Risk system strategy - Translate policies, procedures, and guidelines into process maps and concrete tasks - Identify dependencies, decision points, actors, and opportunities for streamlining - Collaborate with stakeholders to streamline, simplify, and implement model life cycle processes in MRMS - Author Business requirements and reengineer processes and system solutions for simplification and automation - Liaise with IT partners to build effective system solutions - Partner with validation and development groups to drive integration of metrics and documentation digitization, Gen AI POCs with MRMS target state Qualifications Required: - 12+ years of working experience with 5+ years in product development or equivalent role - Familiarity with O&T developing cycle and model risk management or similar - Experience in supporting cross-functional projects with project management and technology on system enhancements - Knowledge/experience with process design, database design, and high proficiency in SQL - Institutional knowledge/experience with Citi platforms/applications preferred - Strong interpersonal skills, project management skills, and experience with Python, R, other programming languages for implementing POCs desired - Expert-level knowledge of MS Excel for data analytics including VBA skills; MS PowerPoint for executive presentations; MS Word for business documentation; MS Visio for process flows and swim lanes - Bachelor's degree in finance, mathematics, computer science, or related field required, Master's Degree preferred (Note: No additional details about the company were mentioned in the provided Job Description.),
ACTIVELY HIRING
posted 2 weeks ago
Experience: 3 to 10 Yrs
location
Maharashtra
skills
  • Business Intelligence
  • Data Management
  • Data Engineering
  • Data Warehousing
  • Data Analysis
  • Data Modeling
  • ETL
  • Software Development
  • Agile Methodologies
  • Solutions Architecture
Job Description
Role Overview: As a Technical Manager specializing in Data Warehouse & Business Intelligence (BI), you will lead a dedicated team responsible for implementing enterprise analytics data engineering strategies. Your role involves collaborating with IT and various business units to construct, enhance, and maintain the Enterprise Data Warehouse. Additionally, you will be tasked with hiring, managing, coaching, and leading a team of Business Intelligence Engineers and Data Engineers. Key Responsibilities: - Understand business requirements and product stakeholders" needs to define data elements and structures for analytical capabilities. - Collaborate with business customers to implement solutions supporting analytical and reporting needs. - Manage projects related to data storage and transformation solutions, including data warehousing, data architecture, data processing, management, and analysis. - Participate in strategic and tactical planning discussions. - Work with Data Engineering and Software Development teams to capture and store key data points effectively. - Act as a liaison between business units and business intelligence for senior leaders. - Coordinate technical and business groups to deliver data warehouse and business intelligence capabilities. - Develop and execute a strategic plan and vision for data warehousing and business intelligence aligned with the business strategy and vision. - Establish standards for technology and business processes and ensure compliance. - Oversee data warehousing, data modeling, development, and application ownership for all Corporate BI. Qualification Required: - Bachelor's Degree in Computer Science, Engineering, Information Technology, or a closely related discipline is preferred. - 10+ years of experience in Information Technology leadership, system design, and application development. - Minimum 3 years of experience managing data-related technology implementation projects such as Data Warehousing and Business Intelligence. - 5+ years of experience in leading Data Warehouse & BI with people management responsibilities. - Strong oral and written communication skills for effective communication with technical and non-technical stakeholders. - Proven ability to meet deadlines, multitask, and prioritize workload effectively. - Excellent analytical skills. Additional Company Details: The Sr. Data Warehouse & BI Engineer in this role will supervise a team consisting of Analysts (BI), ETL Engineers, and Solutions Architects (Data Warehouse). Preferred qualifications include demonstrated industry leadership in technology fields, knowledge of software engineering best practices, and experience providing technical leadership and mentoring to other engineers in data engineering. Functional Competencies: - Strong technical skills in data analytics. - Emphasis on teamwork and collaboration. - Sound judgment and decision-making abilities. - Effective leadership and communication skills.,
ACTIVELY HIRING
posted 3 days ago
Experience: 3 to 7 Yrs
location
Pune, Maharashtra
skills
  • Power BI
  • SQL queries
  • data modeling
  • data warehousing
  • TSQL
  • MDX
  • DAX
  • Microsoft Excel
  • OLAP
  • SQL Server Integration Services (SSIS)
  • database concepts
  • data gateway
  • data preparation projects
  • Microsoft SQL Server BI Stack
  • Power Query
  • ETL framework
  • Agile development methodologies
Job Description
As an Advisor, Data Analysis at Fiserv, your role will be crucial in formulating and delivering automated reports and dashboards using Power BI and other reporting tools. You will focus specifically on inquiry reporting metrics such as MTTR, Avg. Aging, and platform adoption. Your expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS) will be essential in understanding business requirements related to inquiry management and translating them into functional specifications for reporting applications. **Key Responsibilities:** - Formulate automated reports and dashboards using Power BI and other reporting tools, with a focus on inquiry reporting metrics. - Understand specific business requirements related to inquiry management and set functional specifications for reporting applications. - Utilize expertise in SQL queries, Power BI, and SSIS to gather and analyze data related to inquiries for reporting purposes. - Develop technical specifications from business needs and establish deadlines for work completion. - Design data models that transform raw data related to inquiries into insightful knowledge by understanding business requirements in the context of inquiry reporting metrics. - Create dynamic and eye-catching dashboards and reports using Power BI, highlighting key metrics and trends related to inquiries. - Implement row-level security on data and comprehend Power BI's application security layer models, ensuring data privacy and confidentiality related to inquiries. - Collaborate with cross-functional teams to integrate, alter, and connect data sources related to inquiries for business intelligence purposes. - Make necessary tactical and technological adjustments to enhance the current inquiry management reporting systems. - Troubleshoot and resolve issues related to data quality and reporting specifically focused on inquiries. - Communicate effectively with internal teams and client teams to explain requirements and deliver solutions related to inquiry reporting metrics. - Stay up to date with industry trends and advancements in Power BI and business intelligence for effective inquiry reporting. **Qualifications Required:** - Bachelor's degree in Computer Science, Information Systems, or a related field. - Proven experience as a Power BI Developer or similar role, with a specific focus on reporting related to inquiries or customer service metrics. - Expertise in SQL queries, Power BI, and SQL Server Integration Services (SSIS). - Excellent communication skills to effectively articulate requirements and collaborate with internal and client teams. - Strong analytical thinking skills for converting data related to inquiries into illuminating reports and insights. - Knowledge of data warehousing, data gateway, and data preparation projects. - Familiarity with the Microsoft SQL Server BI Stack, including SSIS, TSQL, Power Query, MDX, PowerBI, and DAX. - Detailed knowledge and understanding of database management systems, OLAP, and the ETL framework. - Proficiency in Microsoft Excel and other data analysis tools. - Ability to gather and analyze business requirements specific to inquiries and translate them into technical specifications. - Strong attention to detail and ability to QA and validate data for accuracy. - Ability to manage multiple projects and deadlines simultaneously. - Knowledge of Agile development methodologies is a plus. - Ability to learn and adapt to new technologies and tools quickly.,
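As a rough illustration of the inquiry metrics named above (MTTR and average aging), the following pandas sketch computes them from open/close timestamps before they are surfaced in Power BI; the CSV and column names are assumptions rather than Fiserv's actual schema.

```python
# Assumed schema, not Fiserv's: MTTR and average aging of open inquiries,
# computed with pandas before being surfaced in a Power BI dashboard.
import pandas as pd

inq = pd.read_csv("inquiries.csv", parse_dates=["opened_at", "resolved_at"])

resolved = inq.dropna(subset=["resolved_at"])
mttr_hours = (resolved["resolved_at"] - resolved["opened_at"]).dt.total_seconds().mean() / 3600

open_items = inq[inq["resolved_at"].isna()]
avg_aging_days = (pd.Timestamp.now() - open_items["opened_at"]).dt.days.mean()

print(f"MTTR: {mttr_hours:.1f} hours")
print(f"Avg. aging of open inquiries: {avg_aging_days:.1f} days")
```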
ACTIVELY HIRING
posted 7 days ago
Experience: 5 to 10 Yrs
location
Pune, Maharashtra
skills
  • Tableau
  • SQL
  • Data visualization
  • Advanced SQL
  • TSQL
  • Python
  • Scala
  • Data modeling
  • Data warehousing
  • Performance tuning
  • Stakeholder management
  • Data governance
  • Data security
  • Agile methodologies
  • JIRA
  • Databricks
  • BI development
  • Spark SQL
  • Cloud platforms
Job Description
As a Tableau Data Visualization Lead, you will be responsible for designing, developing, and maintaining interactive dashboards and visual analytics solutions that drive business insights and decision-making. **Job Responsibilities:** - Understanding and performing analysis of source systems data. - Estimation and delivery of adhoc and new Tableau MIS. - Designing and developing advanced & visually compelling Tableau dashboards and visualizations tailored to business needs in the Lending domain. Delivering in data visualization best practices, Tableau BI functionalities, and data storytelling. - Collaborating with business stakeholders to gather requirements and translate them into effective data visualizations. - Optimizing Tableau workbooks for performance and scalability. - Integrating and processing large datasets using Databricks (Spark-based platform). - Building and maintaining data transformation notebooks in Databricks to support reporting and analytics. - Ensuring data quality, consistency, and governance across all visualizations. - Working closely with data engineers, analysts, and business teams to deliver end-to-end BI solutions. - Providing technical leadership and mentorship to junior team members. - Staying updated on industry trends, optimizing performance, and ensuring the usability and scalability of Tableau solutions. - Proactively monitoring, data reconciliation, testing, proper documentation, and troubleshooting of MIS issues. - Ensuring developments follow standard coding patterns and are fully documented for audit submissions. - Ensuring compliance with data governance and data security/masking standards for data delivery. **Competencies for the job:** - Proven experience with Advanced SQL and data transformation for Visualization purposes. - Experience in SQL Query Development (T-SQL or Spark SQL). - 6 to 10 years of experience in data visualization and BI development. - Expert-level proficiency in Tableau including calculated fields, parameters, LOD expressions, Dashboard actions, etc. - 1 to 2 years of hands-on experience in Databricks including notebooks, Delta Lake, Spark SQL, and Python/Scala. - Strong SQL skills and experience working with relational databases. - Experience in the Lending domain with an understanding of loan lifecycle, credit risk, collections, and regulatory reporting. - Familiarity with data modeling, data warehousing concepts, and performance tuning. - Excellent communication and stakeholder management skills. - Experience in advance analytics data modeling and visualization scenarios. **Desired Experience:** - Bachelors or Masters degree in Computer Science, Information Systems, or related field. - Tableau certification is a plus. - Experience with cloud platforms (Azure, AWS, or GCP) is desirable. - Knowledge of Agile methodologies and tools like JIRA. - 5+ years of hands-on experience in Tableau Data Visualization and Data Modeling. - Excellent communication skills in written and verbal English. - Knowledge of core functions of a large NBFC/Bank including Sales, Credit, Collections, Finance, and Accounts.,
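A minimal sketch of the Databricks-side preparation such a dashboard might consume, written for a notebook where the `spark` session is already provided; the lending table and column names are assumptions, not the employer's data model.

```python
# Sketch for a Databricks notebook (where `spark` is predefined); the lending
# table and columns are assumed, not the employer's actual data model.
from pyspark.sql import functions as F

loans = spark.table("lending.raw_loans")   # assumed Delta table

summary = (
    loans
    .withColumn("bucket", F.when(F.col("days_past_due") > 90, "NPA").otherwise("Standard"))
    .groupBy("branch", "bucket")
    .agg(
        F.sum("outstanding_amount").alias("exposure"),
        F.count("*").alias("accounts"),
    )
)

# Materialize for the Tableau extract/live connection to pick up
summary.write.mode("overwrite").saveAsTable("lending.dashboard_loan_summary")
```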
ACTIVELY HIRING
posted 1 day ago

Power BI Developer

iLink Digital
Experience: 5 to 9 Yrs
location
Pune, Maharashtra
skills
  • Power BI
  • DAX
  • Data Modeling
  • SQL
  • ETL
  • Azure
  • Agile
Job Description
As a highly skilled Senior Power BI Developer, your role will involve designing, developing, and maintaining interactive dashboards, reports, and data models to drive actionable insights. Your expertise in Power BI, data modeling, and DAX will be crucial, along with experience in handling complex datasets. Exposure to the healthcare domain would be beneficial. Key Responsibilities: - Design, develop, and deploy Power BI dashboards, reports, and datasets for business stakeholders. - Translate business requirements into data models, KPIs, and visualizations. - Optimize Power BI solutions for performance, scalability, and usability. - Develop and maintain ETL processes to ensure data quality and consistency. - Collaborate with data engineers, business analysts, and domain experts to deliver insights. - Implement row-level security, governance, and compliance in Power BI reports. - Provide mentorship and guidance to junior developers in BI best practices. - Stay updated on the latest Power BI features and BI industry trends. Required Skills: - Strong expertise in Power BI (Power Query, DAX, Dataflows, Paginated Reports). - Solid experience in data modeling, SQL, and relational databases. - Proficiency in ETL tools and processes. - Strong analytical, problem-solving, and communication skills. - Experience with Azure (Data Factory, Synapse, SQL DB, Analysis Services) is a plus. - Ability to work in agile environments and manage multiple priorities. No additional details of the company are mentioned in the job description.,
ACTIVELY HIRING
posted 2 months ago
Experience: 10 to 14 Yrs
location
Pune, Maharashtra
skills
  • Extensive experience in software engineering best practices
  • Experience with Treasury, Audit, etc. systems and their associated development languages and platforms
  • Expert level skills in Java, C#, C++ (or other languages and/or technologies as necessary) and associated IDEs (Visual Studio, Eclipse, IntelliJ, etc.)
  • Expert in software design principles, DevSecOps, CI/CD, and modern development principles
  • Basic understanding of cybersecurity concepts such as encryption, hashing, certificates, PKIs, Threat Modeling, secure coding, and OWASP
  • Experience with Agile methodologies and cloud platforms (Azure, AWS, GCP)
  • Strong communication, collaboration, presentation, and decision-making skills
  • Experience working with diverse, global cultures, organizations, and teams
Job Description
Role Overview: As a System Architect for Enterprise Services - Treasury, Real Estate and Audit Value Stream, your main responsibility is to define and communicate a shared architectural vision that supports current and future business needs. You will collaborate with Agile teams to consider solutions, validate technology assumptions, evaluate alternatives, and converge on a solution. Additionally, you will work closely with Enterprise and Solution Architects to deliver solutions aligned with the broader architecture guardrails. Your role also includes business analysis planning and monitoring, requirements elicitation, management and communication, enterprise analysis, requirements analysis, and solution assessment and validation on large projects following the Eaton PROLaunch project management process standards. Key Responsibilities: - Lead and participate in the planning, definition, development, and high-level design of product solutions and architectural alternatives - Enable a continuous delivery pipeline through proper design guidelines - Define and communicate system interfaces, data structures, data storage, integrations, cybersecurity considerations, test automation concepts, and deployment approaches - Establish and communicate critical non-functional requirements - Consider economic boundaries in design decisions and operate accordingly - Participate in solution planning, incremental planning, product demos, and inspect-and-adapt events - Plan and develop the architectural runway to support desired business outcomes - Provide technical oversight and promote security, quality, and automation - Negotiate with the business to prioritize non-functional work (e.g., patching, platform upgrades) to reduce technical debt over time - Conduct requirements gathering activities such as brainstorming, focus groups, interviews, observation, prototyping, and workshops - Document and validate that the requirements meet stakeholders" needs - Ensure requirements fall within the solution scope and are aligned with business objectives and solution design - Communicate requirements in a format understandable to stakeholders, including solution designers and developers Qualifications: - Bachelor's degree from an accredited institution or equivalent level of education - 10+ years of experience in the software industry with a proven track record of shipping high-quality products Skills: - Extensive experience in software engineering best practices - Experience with Treasury, Audit, etc. systems and their associated development languages and platforms - Expert level skills in Java, C#, C++ (or other languages and/or technologies as necessary) and associated IDEs (Visual Studio, Eclipse, IntelliJ, etc.) - Expert in software design principles, DevSecOps, CI/CD, and modern development principles - Basic understanding of cybersecurity concepts such as encryption, hashing, certificates, PKIs, Threat Modeling, secure coding, and OWASP - Experience with Agile methodologies and cloud platforms (Azure, AWS, GCP) - Strong communication, collaboration, presentation, and decision-making skills - Experience working with diverse, global cultures, organizations, and teams,
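For readers unfamiliar with the cybersecurity basics listed above, the standard-library sketch below shows hashing and HMAC-based message authentication; the key and payload are placeholders and this is not project code.

```python
# Standard-library illustration of hashing and message authentication only;
# the key and payload are placeholders, not project code.
import hashlib
import hmac
import secrets

payload = b'{"payment_id": "12345", "amount": "100.00"}'

digest = hashlib.sha256(payload).hexdigest()                      # integrity
key = secrets.token_bytes(32)                                     # placeholder shared secret
signature = hmac.new(key, payload, hashlib.sha256).hexdigest()    # authenticity

expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
print("SHA-256:", digest)
print("HMAC valid:", hmac.compare_digest(signature, expected))
```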
ACTIVELY HIRING
posted 1 week ago
Experience: 7 to 11 Yrs
location
Pune, Maharashtra
skills
  • Machine learning
  • C#
  • Python
  • CI/CD
  • Agile
  • Scrum
  • Kanban
  • encryption
  • Threat Modeling
  • secure coding
  • OWASP
  • APIs
  • microservices
  • cloud services
  • Azure Open AI
  • LLMOps
  • MLOps
  • RAG
  • Model Tuning
  • DevSecOps
  • hashing
  • certificates
  • PKIs
Job Description
Role Overview: You will be responsible for defining and communicating a shared architectural vision for a complex product that supports current and future business needs. Collaborating with Agile teams, you will consider solutions, validate technology assumptions, evaluate alternatives, and converge on a solution. Working closely with Enterprise and Solution Architects, you will deliver solutions aligned with the broader architecture guardrails. Key Responsibilities: - Lead and participate in the planning, definition, development, and high-level design of complex product solutions and architectural alternatives. - Enable a continuous delivery pipeline through proper design guidelines. - Define and communicate system interfaces, data structures, data storage, integrations, cybersecurity considerations, test automation concepts, and deployment approaches. - Establish and communicate critical nonfunctional requirements. - Consider the impact economic boundaries have on design decisions and operate accordingly. - Participate in solution planning, incremental planning, product demos, and inspect and adapt events. - Collaborate closely with senior IT and business leaders, enterprise architects, and Cyber Security teams. - Oversee the implementation and maintenance of machine learning workflows and manage the lifecycle of large language models. - Design and optimize prompts to guide the behavior of language models and provide strategies to improve their accuracy. - Design and implement AI-driven solutions to accelerate application onboarding to IAM platforms such as Saviynt, Okta, Entra, and KeyFactor. - Develop and maintain CI/CD pipelines for IAM artifacts, including automated testing, validation, and deployment. - Train and fine-tune large language models (LLMs) to interpret REST API documentation and generate IAM integration artifacts (e.g., connectors, entitlement mappings). - Build intelligent agents that can translate API documentation into configurations compatible with IAM solutions such as Saviynt. - Use Robotic Process Automation (RPA) tools (e.g., Power Automate, Automation Anywhere) to automate manual IAM fulfillment tasks that cannot be handled via APIs or connectors. - Collaborate with architects and product managers to identify automation opportunities and translate business needs into technical solutions. - Build and maintain dashboards and KPIs to monitor automation performance, identify bottlenecks, and drive continuous improvement. - Ensure solutions are secure, scalable, and compliant with enterprise standards and regulatory requirements. - Stay current with emerging AI/ML and RPA technologies and recommend tools or frameworks that can enhance IAM automation. Qualifications: - Bachelor's degree from an accredited institution. - 7+ years of experience in the software industry. Skills: - Extensive experience utilizing best practices in software engineering. - Expert level understanding of Azure Open AI, LLMOps, MLOps, Machine learning, RAG, Model Tuning, and accuracy evaluation. - Extensive experience developing enterprise-grade, highly scalable, highly performant applications and/or distributed systems. - Expert level skills in C#, Python. - Expert level understanding of software design principles, design patterns, algorithms, data structures, and multithreading concepts. - Expert level understanding of DevSecOps, CI, and CD principles from code check-in through to deployment. 
- Extensive experience with modern software development principles including code management, test automation, APIs, microservices, and cloud services. - Solid understanding and proven usage of cybersecurity concepts such as encryption, hashing, certificates, PKIs, Threat Modeling, secure coding, and OWASP. - Extensive experience working with Agile, Scrum, or Kanban. - Experience with multiple cloud service providers such as Azure, AWS, and GCP. Certification is a plus. - Familiarity with models like LAMA, Mistral, etc. - Advanced verbal and written communication skills including the ability to explain and present technical concepts to all levels of the organization. - Comfortable working directly with both technical and non-technical audiences. - Good judgment, time management, and decision-making skills. - Ability to work collaboratively on a technical team. - Patience and empathy when working with and coaching developers and testers. - Extensive experience working with diverse, global cultures, organizations, and teams.,
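One way to picture the "generate IAM integration artifacts from REST API documentation" idea above is the toy sketch below, which parses an OpenAPI (Swagger) file into a connector skeleton; the spec file, mapping heuristic, and output shape are assumptions, and no LLM or Saviynt/Okta API is invoked here.

```python
# Toy sketch: draft a connector skeleton from an OpenAPI (Swagger) document.
# The spec file, heuristic, and output shape are assumptions.
import json

with open("target_app_openapi.json") as fh:   # hypothetical API spec
    spec = json.load(fh)

connector = {
    "application": spec.get("info", {}).get("title", "unknown"),
    "operations": [],
}

for path, methods in spec.get("paths", {}).items():
    for verb, details in methods.items():
        if not isinstance(details, dict):
            continue   # skip path-level parameters and extensions
        connector["operations"].append({
            "endpoint": path,
            "method": verb.upper(),
            "summary": details.get("summary", ""),
            # crude heuristic: user-related endpoints become account operations
            "iam_use": "account_management" if "user" in path.lower() else "review",
        })

print(json.dumps(connector, indent=2)[:500])
```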
ACTIVELY HIRING
posted 1 week ago
Experience: 4 to 14 Yrs
location
Pune, Maharashtra
skills
  • SQL
  • Financial risk management
  • Jira
  • Capital Markets
  • Risk Management
  • Business process modeling
  • Credit Risk knowledge
  • LEF
  • SCCL
  • AIRB
  • FIRB models
  • Agile project execution
  • Basel regulation
  • Liquidity reporting
  • Credit exposure calculation
  • IT delivery methodologies
  • Business requirements documentation
Job Description
Job Description You will be responsible for leveraging your Credit Risk knowledge and hands-on experience in Credit Risk reporting like LEF, SCCL, along with a deep understanding of AIRB/FIRB models. Your role will involve IT BA experience, hands-on SQL, Agile project execution, and excellent presentation and communication skills. Key Responsibilities: - Utilize your expertise in Credit Risk models (LGD, PD, EAD) to enhance credit risk reporting and regulatory compliance - Collaborate with stakeholders, product owners, and technology teams to design and deliver innovative solutions - Demonstrate hands-on experience with LEF and SCCL reporting - Ensure a solid understanding of AIRB/FIRB frameworks - Execute Agile projects effectively - Utilize your strong domain expertise to bridge business and technology for leading financial institutions Qualifications Required: - Minimum Bachelor's degree in Business, Finance, Engineering, Math, or Sciences - Experience in financial risk management, credit risk, and credit risk models (LGD, PD, EAD) - Hands-on experience in LEF, ESN, and SCCL reporting - Familiarity with Basel regulation and Agile project methodologies - Knowledge of Capital Markets, Risk Management, and financial instruments - Ability to write clear and structured business requirements documents and user stories - Strong analytical, problem-solving, and communication skills If you are interested in this position, please share your updated resume with AthiAravinthkumar.Selvappandi@cognizant.com.,
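For context on the model outputs mentioned above, expected loss is conventionally computed as EL = PD x LGD x EAD per exposure; the short sketch below applies that formula to synthetic figures and is not tied to any specific reporting system.

```python
# Synthetic figures only: expected loss per exposure as EL = PD * LGD * EAD.
exposures = [
    {"id": "CP-001", "pd": 0.02, "lgd": 0.45, "ead": 1_000_000.0},
    {"id": "CP-002", "pd": 0.01, "lgd": 0.60, "ead": 2_500_000.0},
]

for e in exposures:
    e["expected_loss"] = e["pd"] * e["lgd"] * e["ead"]
    print(f'{e["id"]}: EL = {e["expected_loss"]:,.0f}')

print(f'Portfolio expected loss: {sum(e["expected_loss"] for e in exposures):,.0f}')
```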
ACTIVELY HIRING
posted 3 weeks ago
Experience: 3 to 7 Yrs
location
Pune, Maharashtra
skills
  • System Analysis
  • Design
  • AGILE Methodologies
  • Customer Focus
  • Global Perspective
  • Solution Design
  • Oracle Demand Planning
  • Sales & Operations Planning (S&OP) Cloud
  • Oracle Planning Cloud
  • Enterpriselevel Business Applications
  • Business Insight
  • Manages Complexity
  • Manages Conflict
  • Optimizes Work Processes
  • Tech Savvy
  • Solution Configuration
  • Solution Functional Fit Analysis
  • Solution Modeling
  • Solution Validation Testing
  • Values Differences
Job Description
As an Application Specialist at Cummins Inc., your role involves providing comprehensive expertise in application functionality, configuration, and support for application software solutions. You will collaborate with various stakeholders to design, configure, implement, and enhance applications to meet business requirements effectively. Your responsibilities include: - Evaluating Oracle Demand Planning and Sales & Operations Planning (S&OP) Cloud application functionality to recommend solutions for improving business processes. - Collaborating with process owners, stakeholders, and enterprise architects to gather and review functional, architecture, and technical requirements. - Defining optimal application setup and technical solution designs while configuring Oracle Planning Cloud to meet demand planning and S&OP specifications. - Overseeing the development of integration customizations, workflows, and extensions for application enhancements. - Serving as a subject matter expert on Oracle Planning Cloud content, processes, and functionality. - Analyzing potential application solutions and recommending resolutions for functionality gaps. - Building strong relationships with vendors to enhance application functionality and address issues effectively. - Creating and managing functional specifications for projects to drive application development and solutions. In terms of qualifications, you should possess: - 5+ years of system analysis and design experience for enterprise-level business applications. - Implementation experience of 3+ years in Oracle Planning Cloud Solutions, with expertise in demand planning and S&OP. - Proficiency in the full software development lifecycle process and experience in delivering solutions through AGILE methodologies. Key competencies for this role include: - Business Insight - Customer Focus - Global Perspective - Manages Complexity - Manages Conflict - Optimizes Work Processes - Tech Savvy - Solution Configuration - Solution Design - Solution Functional Fit Analysis - Solution Modeling - Solution Validation Testing - Values Differences Qualifications required: - College, university, or equivalent degree in Computer Science, Information Technology, Business, or related field. - Certification in Oracle Planning & Collaboration Cloud Implementation Professional. - Certification in Oracle Demantra or Oracle Advanced Supply Chain Planning is advantageous. Please note that this job is in the Systems/Information Technology category at Cummins Inc., and it falls under the Hybrid role category. This is an Exempt - Experienced position with ReqID 2411888.,
ACTIVELY HIRING
posted 2 weeks ago
Experience: 5 to 9 Yrs
location
Pune, All India
skills
  • Data visualization
  • SQL
  • Data modeling
  • ETL tools
  • Power BI
  • Agile methodologies
  • Python
  • R
  • Tableau development
  • Data querying
  • KPIs
  • Business metrics
  • Cloud platforms
Job Description
As a Tableau Dashboard Developer / BI Developer, you will be responsible for designing, developing, and deploying interactive dashboards and reports to provide actionable business insights for clients across various domains. Your key responsibilities will include: - Designing, developing, and maintaining interactive dashboards and visualizations using Tableau Desktop and Tableau Server. - Collaborating with business analysts, data engineers, and stakeholders to gather requirements and translate them into effective visual solutions. - Creating calculated fields, parameters, and complex Tableau expressions to support business logic. - Optimizing Tableau dashboards for performance, scalability, and usability. - Implementing data blending, joins, filters, and actions to ensure a seamless user experience. - Connecting Tableau to various data sources such as SQL Server, Snowflake, Excel, and cloud-based data warehouses. - Working closely with the data engineering team to ensure data accuracy, integrity, and consistency. - Supporting deployment and version control of Tableau workbooks on Tableau Server / Tableau Online. - Developing and documenting best practices for dashboard design, color standards, and interactivity. - Performing data validation, UAT, and production support for client dashboards. You should possess: - 5+ years of experience in Tableau development and data visualization. - Strong expertise in SQL and data querying. - Experience connecting Tableau with databases like SQL Server, Oracle, Snowflake, Redshift, or BigQuery. - Proficiency in data modeling and transformation techniques. - Hands-on experience with Tableau Server / Tableau Online for publishing and permissions management. - Familiarity with ETL tools (Informatica, Alteryx, Talend, etc.). - Strong understanding of KPIs, business metrics, and storytelling with data. - Ability to handle multiple projects and deliver within tight deadlines. - Excellent communication and problem-solving skills. Good to have: - Knowledge of Power BI or other BI tools. - Experience with cloud platforms (AWS/Azure/GCP). - Familiarity with Agile methodologies. - Basic scripting knowledge (Python, R) for analytics integration. Educational Qualification: - Bachelors or Masters degree in Computer Science, Information Technology, Data Analytics, or related field. In addition, you can look forward to: - Opportunity to work on diverse client projects and global data solutions. - Support for Tableau certification and professional skill development. - Dynamic, collaborative work environment with leading technologies. - Competitive compensation and performance-based incentives.
ACTIVELY HIRING
posted 2 weeks ago
Experience: 8 to 12 Yrs
location
Pune, Maharashtra
skills
  • Data Science
  • ML
  • Supply Chain
  • Manufacturing
  • Advanced Analytics
  • Machine Learning
  • Artificial Intelligence
  • Statistics
  • Optimization
  • Predictive Modeling
  • Big Data
  • Text Mining
  • Social Listening
  • Recommender Systems
  • Python
  • R
  • Project Management
  • Team Management
  • Healthcare
  • Pharma
  • Communication Skills
  • Critical Thinking
  • Problem Solving
  • AI
  • Claims Analytics
  • Agile Project Planning
Job Description
As a Data Science Manager in Supply Chain & Manufacturing at ZS, you will be part of a passionate team dedicated to transforming ideas into impact by leveraging data, science, technology, and human ingenuity to deliver better outcomes worldwide. You will work side-by-side with experts to develop custom solutions and technology products that drive value and results for clients. **Role Overview:** - Drive client engagements and lead the development and deployment of AI/ML/data science solutions - Lead large scale client delivery engagements in AI projects for the Supply Chain & Manufacturing Industry - Architect and develop predictive models, optimization algorithms, and statistical analyses - Mentor a team of data scientists, providing guidance and support for successful project execution - Drive business development through project delivery and client relationship management - Play a key role in advancing data science capabilities and offerings **Key Responsibilities:** - PhD degree in Computer Science, Statistics, or related discipline - 8+ years of relevant work experience; 10+ years without a PhD - Hands-on experience in applying advanced analytics leveraging AI and ML techniques in Supply Chain areas - Experience leading teams across multiple projects - Proficiency in big data, claims analytics, and advanced analytics concepts - Strong programming background in Python/R - Proven track record in business development - Agile project planning and management experience - Excellent communication, critical thinking, and problem-solving skills **Qualifications Required:** - PhD in Computer Science, Statistics, or related discipline - 8+ years of relevant post-collegiate work experience; 10+ years without a PhD - Hands-on experience in applying advanced analytics leveraging techniques of AI, ML to Supply Chain areas like Demand Planning, Forecasting, Supply Planning, Inventory Management, Logistics - Experience leading a team of data scientists, data engineers, domain experts, LLM engineers, DevOps across multiple projects - Experience with big data, claims analytics, advanced analytics concepts and algorithms (e.g. text mining, social listening, recommender systems, predictive modeling, Gen AI, proficiency with large language models etc.); - Relevant programming background (e.g. Python/R); - Proven track record of business development - Agile project planning and project management experience; - Team management and skill development experience; - Relevant domain knowledge preferred; (healthcare, pharma); - Excellent oral and written communication skills; - Strong attention to detail, with a research-focused mindset; - Excellent critical thinking and problem solving skills; By joining ZS, you will have access to a comprehensive total rewards package including health and well-being benefits, financial planning, annual leave, personal growth opportunities, and professional development programs. ZS fosters a collaborative culture that empowers individuals to thrive and grow within the organization. ZS believes in providing a flexible and connected work environment that allows employees to work from home and on-site at clients/ZS offices. This approach enables a balance between planned and spontaneous interactions that drive innovation and collaboration. ZS values diversity and inclusion, encouraging individuals from various backgrounds and experiences to contribute to the team's success. 
If you are eager to grow, contribute, and bring your unique self to the workplace, ZS welcomes your application. ZS is an equal opportunity employer committed to providing equal employment opportunities without regard to any class protected by applicable law. To apply, candidates must possess or be able to obtain work authorization for their intended country of employment. An online application with transcripts is required for consideration. For more information, visit www.zs.com.,
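As a flavour of the demand-forecasting work referenced above, the sketch below builds a seasonal-naive baseline on synthetic weekly demand; real engagements would use richer ML models, so treat this purely as an illustration.

```python
# Synthetic weekly demand and a seasonal-naive baseline; all numbers are
# made up and nothing here reflects client data or ZS methodology.
import numpy as np
import pandas as pd

idx = pd.date_range("2023-01-01", periods=104, freq="W")
rng = np.random.default_rng(0)
demand = pd.Series(
    200 + 30 * np.sin(np.arange(104) * 2 * np.pi / 52) + rng.normal(0, 10, 104),
    index=idx,
)

# Forecast the last 12 weeks by repeating the same weeks one year earlier
forecast = demand.shift(52).iloc[-12:]
actual = demand.iloc[-12:]
mape = (abs(actual - forecast) / actual).mean() * 100

print(f"Seasonal-naive MAPE over the 12-week holdout: {mape:.1f}%")
```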
ACTIVELY HIRING
posted 3 weeks ago
Experience: 3 to 7 Yrs
location
Maharashtra
skills
  • Revit
  • Dynamo
  • Power BI
  • NET
  • C#
  • VB
  • Python
  • Agile methodologies
  • Grasshopper 3D
  • Rhino modeling
  • ACC
  • AI/ML implementation
  • ReactJS
  • Ladybug tools
  • Dynamo for BIM
  • Autodesk Construction Cloud ACC
Job Description
As a Computational Designer in the Architecture, Engineering, and Construction (AEC) industry, your role will involve utilizing scripting skills in Grasshopper 3D and Rhino modeling to automate architectural design processes and optimize workflows. You will collaborate with architects, engineers, and construction professionals to develop computational design solutions for complex projects. Additionally, you will be responsible for developing and maintaining custom scripts and tools for seamless integration between parametric modeling in Rhino/Grasshopper and BIM processes in Revit. Your ability to create custom scripts and tools using .NET (C#, VB) and Python to enhance functionality within Grasshopper and other IDEs will be crucial. It is important to stay updated on emerging technologies and trends in construction technology, with a focus on AI/ML implementation. You will actively participate in research and development efforts to explore innovative approaches to design and construction processes. Key Responsibilities: - Utilize scripting skills in Grasshopper 3D and Rhino modeling to automate architectural design processes and optimize workflows. - Collaborate with architects, engineers, and construction professionals to develop computational design solutions for complex projects. - Develop and maintain custom scripts and tools for seamless integration between parametric modeling in Rhino/Grasshopper and BIM processes in Revit. - Create custom scripts and tools using .NET (C#, VB) and Python to enhance functionality within Grasshopper and other IDEs. - Stay updated on emerging technologies and trends in construction technology, with a focus on AI/ML implementation. - Actively participate in research and development efforts to explore innovative approaches to design and construction processes. Qualifications Required: - Bachelors degree in architecture, Engineering Civil. Masters in computational design or related field preferred. - Proficiency in Grasshopper 3D and Rhino modeling software. - Proficiency in programming skills in C#, VB, and Python, with the ability to develop scripts and plugins for design automation. - Strong foundational knowledge of Revit and its interoperability with Grasshopper. - Strong problem-solving skills and the ability to work effectively in a collaborative team environment. - Passion for construction technology and a desire to innovate using AI/ML tools. - Excellent communication and presentation skills, with the ability to convey technical concepts to diverse audiences. Additional Company Details: We are looking for candidates who are able to thrive in a fast-paced, dynamic environment, effectively managing multiple priorities and adapting quickly to changing project requirements. You should be able to independently execute a select set of functional processes and persuade and obtain buy-in from direct stakeholders by clearly articulating a point of view and using data and logic. Moreover, you should always deliver on goals and projects on time, with high quality and cost efficiency. Displaying a high degree of emotional maturity and awareness of your own impact, as well as demonstrating the capability to work on self to enhance capabilities, is essential for success in this role.,
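A minimal GhPython-style sketch of the parametric automation described above: an attractor-driven circle grid of the sort used for facade or panel studies. It runs only inside Rhino/Grasshopper's Python component, and the constants shown would normally be wired in as component inputs.

```python
# Runs only inside Rhino/Grasshopper's Python (GhPython) component, not as a
# standalone script; the constants below would normally be component inputs.
import Rhino.Geometry as rg

count = 10                              # grid size (assumed input)
attractor = rg.Point3d(4.5, 4.5, 0)     # attractor point (assumed input)

circles = []
for i in range(count):
    for j in range(count):
        center = rg.Point3d(i, j, 0)
        d = center.DistanceTo(attractor)          # distance drives panel size
        circles.append(rg.Circle(center, max(0.1, 0.45 - 0.05 * d)))

a = circles                             # GhPython output parameter
```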
ACTIVELY HIRING
posted 1 week ago
experience10 to 14 Yrs
location
Pune, Maharashtra
skills
  • Data Engineering
  • Data Analysis
  • Power BI
  • Data Extraction
  • Data Transformation
  • Data Analysis
  • Data Visualization
  • Communication
  • Collaboration
  • Agile Development
  • Business Intelligence
  • Data Modeling
  • Data Warehousing
  • DAX
  • Data Governance
  • Data Infrastructure
  • UI/UX
  • Problem-solving
  • ETL Processes
  • Power Query
  • User Testing
  • Feedback Gathering
Job Description
As a Data Engineer and Analyst in the global Engineering Planning and Reporting team at Siemens Energy, your role will involve designing, developing, and maintaining data infrastructure to support the digitalization demands of Compression Engineering and Product Management. You will collaborate with stakeholders to optimize data extraction, transformation, and analysis, develop Power BI reports, and ensure the usability and aesthetic quality of reports and dashboards. Your responsibilities will also include implementing data governance practices, maintaining documentation, and continuously improving the user experience through testing and feedback.

Key Responsibilities:
- Design, develop, and maintain data infrastructure for a global Engineering organization.
- Optimize data extraction, transformation, and analysis processes.
- Develop Power BI reports and ensure usability and aesthetic quality.
- Implement data governance practices and maintain documentation.
- Conduct user testing and gather feedback for continuous improvement.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- 10+ years of experience with business intelligence, data modeling, data warehousing, and ETL processes.
- Proven experience as a Power BI Developer, including DAX and Power Query.
- Solid understanding of UI/UX principles and best practices.
- Excellent problem-solving skills, attention to detail, effective communication, and collaboration skills.
- Preferred: familiarity with Agile development methodologies and software (e.g., Jira).
- Preferred: familiarity with Engineering processes and systems (e.g., SAP, Teamcenter).

Siemens Energy's Transformation of Industry division focuses on decarbonizing the industrial sector by enabling the transition to sustainable processes. With a strong industrial customer base, a global network, diverse technologies, and integrated capabilities, the division plays a crucial role in driving Siemens Energy's mission forward.

Siemens Energy is a global energy technology company committed to developing sustainable and reliable energy systems. With over 100,000 employees in more than 90 countries, the company drives the energy transition and supports one sixth of the world's electricity generation. The company values diversity and inclusion, celebrating the unique contributions of individuals from over 130 nationalities.

Siemens Energy provides benefits such as Medical Insurance coverage for all employees, including family members, and options for Meal Cards as part of the compensation package. The company's focus on innovation, decarbonization, and energy transformation creates opportunities for employees to make a difference in the energy sector.
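As an illustration of the extract-and-transform work that feeds a Power BI model in a role like this, here is a minimal Python/pandas sketch. The file and column names (engineering_hours.csv, project_id, hours, report_date) are hypothetical placeholders, and writing to Parquet assumes pyarrow is installed; treat this as a sketch of the general pattern rather than the team's actual pipeline.

```python
import pandas as pd


def build_reporting_extract(source_csv: str, output_parquet: str) -> pd.DataFrame:
    """Extract raw hours data, clean it, and aggregate per project and month."""
    raw = pd.read_csv(source_csv, parse_dates=["report_date"])
    cleaned = raw.dropna(subset=["project_id", "hours"])
    cleaned = cleaned[cleaned["hours"] >= 0]  # drop negative bookings
    monthly = (
        cleaned
        .assign(month=cleaned["report_date"].dt.to_period("M").dt.to_timestamp())
        .groupby(["project_id", "month"], as_index=False)["hours"]
        .sum()
    )
    monthly.to_parquet(output_parquet, index=False)  # Power BI reads this extract
    return monthly


if __name__ == "__main__":
    build_reporting_extract("engineering_hours.csv", "monthly_hours.parquet")
```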
ACTIVELY HIRING
posted 2 months ago
experience7 to 11 Yrs
location
Pune, Maharashtra
skills
  • HANA
  • SAP Business Objects
  • MS Office
  • SAP HANA data modeling
  • SAP modules like Finance
  • ABAP programming
  • ABAP CDS Views
Job Description
As a Native HANA Modeling Professional at YASH Technologies, you will play a crucial role in designing and creating HANA data models to support various reporting requirements. You will actively participate in requirements gathering, translating high-level business requirements into functional and technical specifications. Your role will involve leading the implementation of consolidated reporting requirements from multiple data sources and providing guidance on best practices to ensure efficient knowledge transfer within the team.

Key Responsibilities:
- Analyze, design, and create HANA data models for reporting needs.
- Lead requirement and design sessions, debugging, unit testing, and deployment of reports.
- Design and develop dashboard reports.
- Collaborate with team members to understand business requirements and create technical specifications.
- Prepare and maintain technical documentation.
- Lead data migration and archiving activities between HANA on-prem and cloud environments.

Qualifications Required:
- Bachelor's degree in a technical field.
- 7+ years of experience with HANA.
- Strong background in SAP HANA data modeling, preferably with experience in SAP modules like Finance.
- Experience with SAP Business Objects tools and HANA view optimization techniques.
- Working knowledge of ABAP programming, including ABAP CDS Views.
- Intermediate proficiency in MS Office.

At YASH, you will have the opportunity to shape your career in a supportive and inclusive environment. Our Hyperlearning workplace is built on principles of flexible arrangements, agile collaboration, and continuous learning. Join us to be part of stable employment with a great atmosphere and an ethical corporate culture, where your contributions will drive real positive change in the virtual world.
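For illustration only, the sketch below shows one common way to consume a HANA calculation view from Python using SAP's hdbcli client. The connection details, the package/view name (FINANCE_PKG/CV_GL_BALANCES), and the column names are hypothetical, and the _SYS_BIC schema applies to repository-based views; treat this as a generic pattern rather than the project's actual setup.

```python
from hdbcli import dbapi  # SAP HANA Python client


def fetch_gl_balances(host: str, port: int, user: str, password: str):
    """Query a (hypothetical) calculation view and return aggregated rows."""
    conn = dbapi.connect(address=host, port=port, user=user, password=password)
    try:
        cur = conn.cursor()
        cur.execute(
            'SELECT "CompanyCode", "FiscalYear", SUM("Amount") AS "Total" '
            'FROM "_SYS_BIC"."FINANCE_PKG/CV_GL_BALANCES" '
            'GROUP BY "CompanyCode", "FiscalYear"'
        )
        return cur.fetchall()
    finally:
        conn.close()
```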
ACTIVELY HIRING
posted 2 weeks ago
experience3 to 7 Yrs
location
Pune, All India
skills
  • SQL
  • Excel
  • Power BI
  • Tableau
  • Python
  • R
  • Agile
  • Scrum
  • Data modeling
  • Database design
  • Statistical analysis
  • Predictive modeling
  • ETL processes
  • CRM systems
  • ERP tools
Job Description
As a Business Analyst - Wealth Management & Capital Markets at NTT DATA in Pune, Maharashtra (IN-MH), India, your role will involve supporting data-driven decision-making across the organization by bridging the gap between business needs and data insights. You will collaborate with stakeholders to gather, document, and analyze business requirements, develop and maintain business process models, workflows, and system documentation, identify opportunities for process improvement and automation, and translate business needs into functional specifications and data requirements. Additionally, you will collect, clean, and validate large datasets from multiple sources, analyze data to identify trends and insights, develop dashboards and reports, and build and maintain key performance indicators and business metrics. You will also work closely with cross-functional teams to align on goals and outcomes, act as a liaison between technical teams and business units, and effectively communicate findings, recommendations, and risks to management.

Key Responsibilities:
- Collaborate with stakeholders to gather, document, and analyze business requirements
- Develop and maintain business process models, workflows, and system documentation
- Identify opportunities for process improvement and automation
- Translate business needs into functional specifications and data requirements
- Support implementation of new tools, systems, and reports
- Collect, clean, and validate large datasets from multiple sources
- Analyze data to identify trends, patterns, and insights
- Develop dashboards and reports using tools like Power BI, Tableau, or Excel
- Build and maintain key performance indicators and business metrics
- Present insights and recommendations clearly to technical and non-technical audiences
- Work closely with cross-functional teams to align on goals and outcomes
- Act as a liaison between technical teams and business units
- Communicate findings, recommendations, and risks effectively to management

Qualifications Required:
- Bachelor's degree in Business Administration, Data Science, Statistics, Computer Science, Economics, or related field
- Proven experience (2-5 years) as a Business Analyst, Data Analyst, or similar role
- Strong proficiency in SQL, Excel, and data visualization tools like Power BI, Tableau, or Looker
- Experience with data analysis languages such as Python or R is a plus
- Excellent analytical, problem-solving, and critical-thinking skills
- Strong communication and documentation abilities
- Familiarity with Agile and Scrum methodologies

About NTT DATA:
NTT DATA is a $30 billion trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. They are committed to helping clients innovate, optimize, and transform for long-term success. With diverse experts in more than 50 countries and a robust partner ecosystem, NTT DATA offers business and technology consulting, data and artificial intelligence services, industry solutions, application development, infrastructure management, and more. As a leading provider of digital and AI infrastructure, NTT DATA is dedicated to helping organizations and society move confidently into the digital future.
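To illustrate the kind of KPI analysis this role calls for, here is a minimal Python/pandas sketch that computes month-over-month growth in assets under management by client segment. The file and column names (aum_snapshots.csv, segment, aum, snapshot_date) are hypothetical placeholders, not part of the posting.

```python
import pandas as pd


def mom_aum_growth(csv_path: str) -> pd.DataFrame:
    """Aggregate AUM per segment and month, then compute month-over-month growth."""
    df = pd.read_csv(csv_path, parse_dates=["snapshot_date"])
    df["month"] = df["snapshot_date"].dt.to_period("M")
    monthly = df.groupby(["segment", "month"], as_index=False)["aum"].sum()
    monthly = monthly.sort_values(["segment", "month"])
    # Percentage change versus the previous month within each segment.
    monthly["mom_growth_pct"] = monthly.groupby("segment")["aum"].pct_change() * 100
    return monthly


if __name__ == "__main__":
    print(mom_aum_growth("aum_snapshots.csv").head())
```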
ACTIVELY HIRING