Core Data Jobs in Chennai

460 Core Data Jobs in Chennai

posted 2 weeks ago
experience: 3 to 7 Yrs
location
Chennai, All India
skills
  • Data Analytics
  • Business Intelligence
  • Anaplan
  • MIS
  • Linear Regression
  • Correlation
  • Probability
  • Logic
  • Change Management
  • Marketing
  • Sales
  • MS Excel
  • MS PowerPoint
  • SFDC
  • Market Data Analysis
  • Statistical Techniques
  • SalesBI
  • PowerBI
Job Description
Role Overview: As a Data Analytics/Business Intelligence professional at HITACHI ENERGY TECHNOLOGY SERVICES PRIVATE LIMITED in Chennai, Tamil Nadu, India, you will support decision-making based on market data and help nurture a data-driven culture. Your main responsibilities will include gathering and interpreting data and analyzing results using analytics, research methodologies, and statistical techniques. You will play a key role in assessing BU and GPG needs during the M&S yearly cycle.

Key Responsibilities:
- Be the person of reference in the GPG for market intelligence and analytical tools such as dashboards, Excel-based files, and SalesBI/PowerBI/STRATOS/Anaplan/MIS.
- Prepare, produce, and align GPG Mekko/STRATOS for Hitachi Energy global processes such as Global Target Setting (GTS) and Marketing Input Market Outputs (MIMO).
- Collect data and provide business and competitor intelligence analyses related to market trends. Contribute to the definition of strategic plans and support their implementation.
- Support key strategic initiatives of the GPG by collecting feedback and providing insights to improve hit-rate. Assist in the development of business analysis tools to support global, regional, and local marketing & sales teams.
- Support demand-vs-supply Orders Received budget mapping, Market Intelligence System (MIS) updates, Master Data checks and validation, and GPG product portfolio analysis.
- Develop and implement data analyses, optimize statistical efficiency and quality, act as a liaison between HUBs, GPGs, and the BU, evaluate key performance indicators (KPIs), and provide ongoing reports.
- Extract and analyze data from sales/financial tools, provide ad-hoc analysis based on business needs, and develop and maintain reports and dashboards.
- Ensure compliance with applicable external and internal regulations, procedures, and guidelines. Uphold Hitachi Energy's core values of safety and integrity.

Qualifications Required:
- Bachelor's degree in engineering, mathematics, economics, or an equivalent qualification.
- Experience in collecting, organizing, analyzing, and disseminating information accurately. Ability to write reports and present findings professionally.
- Proficiency in SFDC and PowerBI for developing reports and dashboards. Basic to intermediate statistical knowledge.
- Intermediate change management skills, good marketing and sales experience, strong attention to detail, and the ability to work under tight deadlines.
- Proactive, self-motivated, and passionate, with integrity and energy. Excellent MS Excel/PowerPoint/SharePoint skills.
- Proficiency in both spoken and written English.

(Note: Any additional details of the company were not included in the provided job description.)
ACTIVELY HIRING

posted 3 weeks ago

Data Scientist I

Cambridge Mobile Telematics
experience: 2 to 6 Yrs
location
Chennai, Tamil Nadu
skills
  • Data Science
  • Computer Science
  • Statistics
  • Mathematics
  • Engineering
  • Machine Learning
  • Deep Learning
  • Python
  • NumPy
  • SQL
  • Hadoop
  • Spark
  • Pandas
  • scikit-learn
  • TensorFlow
  • Keras
  • Torch
  • Caffe
  • Big Data infrastructure
Job Description
You will be joining Cambridge Mobile Telematics (CMT), the world's largest telematics service provider, on a mission to make the world's roads and drivers safer. CMT's AI-driven platform, DriveWell Fusion, collects sensor data from various IoT devices and merges it with contextual data to provide insights into vehicle and driver behavior. Auto insurers, automakers, commercial mobility companies, and the public sector use CMT's platform to enhance risk assessment, safety, claims, and driver improvement programs. With headquarters in Cambridge, MA, and offices in Budapest, Chennai, Seattle, Tokyo, and Zagreb, CMT serves tens of millions of drivers globally every day.

As a data science team member at CMT, your responsibilities will include:
- Assisting with projects from data pre-processing to roll-out with minimal oversight
- Writing code to debug complex issues and creating new solutions for production systems
- Supporting customers' requests related to production algorithms and deriving insights from data
- Communicating and presenting data science work to stakeholders and collaborating across different teams
- Completing any additional tasks as needed

Qualifications for this role include:
- Bachelor's degree or equivalent experience/certification in Data Science, Computer Science, Statistics, Mathematics, or Engineering
- 2+ years of professional experience in Data Science
- Understanding of data science principles, algorithms, and practices such as machine learning, deep learning, statistics, and probability
- Proficiency in scripting languages and libraries such as Python, Pandas, NumPy, scikit-learn, and SQL
- Ability to write code, request data, and code from scratch
- Experience with deep learning frameworks and knowledge of Big Data infrastructure are a plus
- Strong team player and quick learner

In terms of compensation and benefits, you can expect:
- A fair and competitive salary based on skills and experience, with an annual performance bonus
- Possibility of equity in the form of Restricted Stock Units (RSUs)
- Comprehensive benefits including medical, dental, vision, life insurance, 401k matching, disability, and parental leave
- Unlimited Paid Time Off and flexible scheduling, including work-from-home options

Additional perks of joining CMT:
- Contributing to improving road safety globally
- Participation in employee resource groups and wellness programs
- Commitment to creating a positive and inclusive work environment for all employees

At CMT, diversity and inclusion are core values, and the company actively seeks candidates from all backgrounds to join the team.
ACTIVELY HIRING
posted 2 weeks ago
experience: 5 to 9 Yrs
location
Chennai, All India
skills
  • Data Governance
  • Business Process Reengineering
  • Data Profiling
  • Data Analytics
  • Technical and Architectural Knowledge
  • Financial governance, oversight, and control frameworks
  • Programme and Project Management Experience
  • Interpersonal, Influencing, and Stakeholder Management
  • Operating Model Design
  • Financial Process Documentation
  • Knowledge of Financial Institutions
  • Financial Products
Job Description
As a candidate applying for this role at Standard Chartered Bank (Job ID 40142), located in Chennai, IN, in the area of Audit, Accounting & Finance, you will be responsible for the following:

Role Overview: You will lead and maintain the infrastructure under the GFS Data Management portfolio. This includes developing a roadmap, implementing data controls such as reconciliation and validations, ensuring data quality, utilizing an agile delivery model, applying knowledge of the finance & treasury domains, creating a Service Management plan, managing control changes, and ensuring continuous availability of the infrastructure to stakeholders.

Key Responsibilities:
- Display exemplary conduct and adhere to the Group's Values and Code of Conduct.
- Take personal responsibility for embedding the highest standards of ethics, regulatory and business conduct.
- Lead to achieve the outcomes set out in the Bank's Conduct Principles.
- Identify, escalate, mitigate, and resolve risk, conduct, and compliance matters effectively and collaboratively.
- Serve as a Director of the Board and act in accordance with the Articles of Association.
- Other responsibilities include embedding Here for good and the Group's brand and values, and performing other assigned responsibilities under Group, Country, Business, or Functional policies and procedures.

Qualifications:
- Possess an academic degree from a well-recognized institution.
- Other qualifications include stakeholder management, effective communication, the ability to work under pressure and adapt to change quickly, being highly detailed and result-oriented, and an understanding of Financial Reporting & related systems.

About Standard Chartered: Standard Chartered is an international bank with a rich history of over 170 years. The bank aims to make a positive difference for clients, communities, and employees. Working at Standard Chartered means questioning the status quo, embracing challenges, and finding opportunities to grow and excel. The bank values diversity and inclusion and advocates for purpose-driven careers. Celebrating unique talents and fostering an inclusive environment are core values of the organization.

What we offer:
- Core bank funding for retirement savings, medical and life insurance.
- Time-off benefits including annual leave, parental/maternity leave, sabbatical, and volunteering leave.
- Flexible working options based on home and office locations.
- Proactive wellbeing support through digital platforms, development courses, and an Employee Assistance Programme.
- A continuous learning culture to support growth and opportunities for reskilling and upskilling.
- Being part of an inclusive and values-driven organization that celebrates diversity and respects individual potential.
ACTIVELY HIRING

posted 2 months ago

Lead, NFR, Data Science

Standard Chartered India
experience: 10 to 14 Yrs
location
Chennai, Tamil Nadu
skills
  • SQL
  • Machine Learning
  • Business Analysis
  • Data Engineering
  • Data Analytics
  • Data Visualization
  • Stakeholder Management
  • Communication Skills
  • Python Programming
  • Generative AI
Job Description
You will be working in a techno-functional data programming role within the Risk & CFCC Data Store, Analysis & AI-backed product delivery area. Your responsibilities will include designing, developing, and maintaining the Compliance and Financial Crime Data Store & Reporting, as well as providing AI-backed solutions. This role involves both new development projects and BAU support for legacy data products. It is essential to ensure accurate and thorough documentation of data products.

Key Responsibilities:
- Lead data engineering/data analytics projects end-to-end, providing solutions and services addressing data delivery and accessibility
- Conduct business analysis of Risk & CFCC use cases and collaborate with cross-functional teams for solution deliveries
- Define data solutions by liaising with Risk Framework Owners and the Reporting team
- Collaborate with Technology Developers from the CIO team to implement strategic data solutions
- Utilize Python and SQL programming for data solution development and debugging
- Design AI agents using Generative AI tools
- Develop Presto SQL views to support data delivery and data visualization
- Use tools like Jupyter Notebook, Excel Power Query, and Tableau for data programming and dashboard development
- Identify, analyze, and manage data mapping and reference data
- Conduct data investigations and reconciliation to support reporting needs and legacy BAU activities
- Prepare solution documentation and data dictionaries, and assist in Data Quality Management

Qualifications:
- Experience in Risk
- Prior experience in SQL & Python programming
- Knowledge of Machine Learning techniques and Generative AI
- Strong business analysis and effective communication skills
- Graduate or post-graduate degree in Computer Science, Finance, or a similar quantitative discipline
- Minimum of 10 years of experience in a similar role, specifically in Risk
- Proficiency in SQL & Python data programming, with skills in Jupyter Notebook
- Experience in data product build/delivery
- Good stakeholder management & communication skills
- Ability to establish good working relationships with individuals at all levels and work effectively in a team environment
- Capacity to work under time pressure and adapt to changing requirements

Standard Chartered is an international bank that is both nimble enough to act and big enough to make an impact. With a history spanning over 170 years, the bank is dedicated to making a positive difference for its clients, communities, and employees. Standard Chartered values diversity and inclusion, driving commerce and prosperity through its unique diversity. The bank is committed to being here for good and encourages its employees to challenge the status quo, seek opportunities for growth, and continuously strive for improvement. Standard Chartered offers various benefits including core bank funding for retirement savings, medical and life insurance, flexible working options, proactive wellbeing support, a continuous learning culture, and an inclusive, values-driven work environment that celebrates diversity and advocates inclusion.
ACTIVELY HIRING
posted 6 days ago
experience: 9 to 15 Yrs
location
Chennai, Tamil Nadu
skills
  • Business Analysis
  • Requirement Gathering
  • BRD
  • FRD
  • MS SQL Server
  • OLAP
  • Agile Scrum
  • Azure DevOps
  • ETL/ELT Solutions
  • SQL querying
  • API Strategy
  • Azure-hosted web applications
  • Data Warehouse
  • English Communication
  • Cross-functional Collaboration
Job Description
As a Lead Technical Data Business Analyst, you will play a crucial role in supporting AI solutions, reporting, and analytics projects. Your responsibilities will include leading requirement gathering and business analysis activities, preparing detailed Business Requirements Documents (BRD) and Functional Requirements Documents (FRD), formulating user stories, managing backlog items, coordinating with product owners and developers, participating in timeline discussions, supporting AI feature development, and ensuring the quality of deliverables throughout the project lifecycle. Key Responsibilities: - Lead requirement gathering and business analysis activities across multiple project streams - Prepare and challenge detailed Business Requirements Documents (BRD) and Functional Requirements Documents (FRD) - Formulate clear user stories including descriptions and acceptance criteria - Manage and prioritize backlog items in collaboration with product owners - Coordinate closely with product owners and developers to align technical capabilities with business needs - Participate in timeline discussions to re-prioritize or simplify requirements to meet deadlines - Incorporate UX design considerations and actively participate in PO/UX calls for requirements analysis - Support AI feature development based on data analysis and querying using MS SQL Server, Power BI, and Data Warehouse technologies - Communicate fluently in English with U.S. and Canada-based product owners - Adapt to change requests and operate effectively in a startup-like environment - Work within Scrum processes and use Azure DevOps for project management - Lead client and team scope discussions ensuring clarity and progress - Monitor and ensure the quality of deliverables throughout the project lifecycle Qualifications Required: - 9 to 15 years of experience in Business Analysis - Proven leadership experience in managing business analysis activities and client communications - Strong expertise in requirement gathering, preparation of BRD and FRD documents - Technical knowledge of ETL/ELT solutions and experience with reporting tools - Proficient in API strategy and SQL querying with MS SQL Server - Experience with Azure-hosted web applications and analytics platforms like Data Warehouse and OLAP - Ability to translate complex business needs into technical specifications - Strong organizational skills with the ability to manage multiple streams simultaneously - Experience working in Agile Scrum environments using Azure DevOps - Excellent communication skills with fluent English (C1 Advanced level) - Capable of working independently and supporting multiple stakeholders - Ability to collaborate effectively with cross-functional teams including developers and product owners - Adaptability to change requests and dynamic project environments Nice to have: - Familiarity with .NET Core, ASP.NET MVC, and Kubernetes hosting platforms - Experience with AI and GenAI feature development - Knowledge of frontend technologies such as JavaScript/TypeScript and Angular - Understanding of UX design principles and participation in UX calls - Certification in business analysis or Agile methodologies,
ACTIVELY HIRING
posted 1 week ago
experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Data processing
  • Data solutions
  • Programming languages
  • SQL
  • Reporting
  • Analysis
  • MIS
  • Analytical skills
  • Communication skills
  • Time management
  • SQL
  • Oracle
  • Apache Spark
  • Power BI
  • Business Objects
  • Crystal
  • Data Warehousing
  • SQL
  • Python
  • SAS
  • R
  • Spark
  • Apache Spark
  • Azure
  • AWS
  • Data governance
  • Quality assurance
  • Data Analyst
  • Azure environment
  • Banking data
  • Databricks
  • Data Product deployment
  • Problem-solving skills
  • Banking Retail, Business & Wholesale Product Knowledge
  • Teamwork skills
  • Data Visualization tools
  • ETL concepts
  • Cloud platforms
  • Data modeling techniques
  • Compliance best practices
  • Computer science fundamentals
Job Description
As a Data Analyst at Allime Tech Solutions, you will play a crucial role in developing and managing data infrastructure within an Azure environment. Your expertise in working with Data Bricks for data processing and analysis will be essential for implementing and optimizing data solutions. Your responsibilities will include: - Developing data pipelines and managing data infrastructure within Azure - Working on end-to-end Data Product deployment - Collaborating with cross-functional teams for reporting, analysis, and MIS - Utilizing strong analytical and problem-solving skills to design effective data solutions - Utilizing data querying and processing skills using SQL, Oracle, Apache Spark, etc. - Leveraging data visualization tools such as Power BI, Business Objects, Crystal, etc. - Implementing data warehousing and ETL concepts - Ensuring data governance, quality assurance, and compliance best practices - Utilizing core computer science fundamentals relevant to data engineering Qualifications required for this role include: - Bachelor's or master's degree in computer science, Information Technology, or a related field - 7+ years of proven experience as a Data Analyst with 3 years of experience in banking data - Expertise in Databricks and SQL - Strong knowledge of banking retail, business & wholesale product - Certification in any Cloud Technology would be an advantage At Allime Tech Solutions, we are dedicated to empowering innovation through technology and connecting talent with opportunities. Our commitment to integrity and excellence drives us to provide tailored solutions for our clients. Join us in creating a future where everyone can thrive.,
ACTIVELY HIRING
posted 1 day ago
experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Data Transformation
  • SQL
  • Python
  • DBT Core
  • DBT Macros
Job Description
As a skilled Developer with 3 to 6 years of experience, you will be responsible for developing and maintaining data transformation workflows using DBT Core and DBT Macros to optimize data processes. Your key responsibilities will include: - Collaborating with cross-functional teams to ensure seamless data integration and accurate data flow across systems. - Utilizing SQL to design, implement, and optimize complex queries for data extraction and analysis. - Implementing Python scripts to automate data processing tasks and improve efficiency. - Analyzing and interpreting data to provide actionable insights that support business decision-making. - Ensuring data quality and integrity by conducting thorough testing and validation of data transformations. - Participating in code reviews and providing constructive feedback to peers to maintain high coding standards. - Working closely with stakeholders to understand their data needs and deliver solutions that meet their requirements. - Staying updated with the latest industry trends and technologies to continuously improve data transformation processes. - Documenting data transformation processes and maintaining comprehensive records for future reference. - Troubleshooting and resolving data-related issues in a timely manner to minimize the impact on business operations. - Contributing to the development of best practices for data management and transformation within the organization. - Supporting the team in achieving project goals by actively participating in planning and execution phases. In order to excel in this role, you must possess the following qualifications: - Strong technical skills in DBT Core, DBT Macros, SQL, and Python essential for effective data transformation. - Proficiency in data integration techniques to ensure seamless data flow across platforms. - Excellent problem-solving abilities to address complex data challenges efficiently. - Good understanding of the Property & Casualty Insurance domain enhancing data-driven decision-making. - Strong communication skills in English, both written and spoken, to collaborate effectively with team members. - Proactive approach to learning and adapting to new technologies and methodologies. - Attention to detail to ensure data accuracy and reliability in all deliverables.,
ACTIVELY HIRING
posted 3 days ago

Analyst, Risk Data Strategy

Standard Chartered
experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Python
  • R
  • SAS
  • MATLAB
  • Excel
  • SQL
  • Tableau
  • Banking Risk Management
  • Data sciences
Job Description
As a Risk Data Hub member at Standard Chartered in Chennai, IND, your primary responsibility will be Data Sourcing and Maintenance of risk data. This involves a set of data wrangling activities including gathering, transforming, mapping, building business views, and automating sourcing processes to enrich the quality of data for downstream purposes such as analytics, reporting, and visualization. Your key responsibilities include: - Ensuring appropriate BAU controls are established and deployed effectively in GBS - Working with partner teams to create and build next-generation data products and analytics solutions - Assessing problem statements and proposing potential solutions by advocating and enabling data-driven analytics - Contributing to/leading the development of prototype solutions and conducting thorough validation and data quality checks - Collaborating with stakeholders across all seniority levels to address urgent and strategic risk management needs - Coordinating with ERM teams to leverage existing results/analytics or processes - Providing expertise in migrating PoC projects to productionize in collaboration with various teams - Leading by example and demonstrating the bank's culture and values - Identifying, assessing, monitoring, and mitigating risks to the Group relevant to Data Quality - Ensuring adherence to Group BCBS 239 standard when appropriate - Providing timely and high-quality responses to both internal and external queries and requests Qualifications required: - Bachelor or Masters degree with technical degree preferred (statistics, mathematics, computer science, etc.) - Extensive programming experience in SQL, Tableau, Python, SAS, Excel Automation, etc. - Strong analytical mindset with excellent analytical, logical, reasoning, and problem-solving skills - Hands-on experience in data used for models development and best practices in data management - Previous experience in risk management, Stress Testing, or IFRS9 is an added advantage - Excellent written and oral communication skills at all levels and situations - Exposure to advanced machine learning methodologies is a plus Skills and Experience needed: - Banking Risk Management - Python, R, SAS, and/or MATLAB - Excel - Data sciences - SQL - Tableau About Standard Chartered: Standard Chartered is an international bank that has been making a positive difference for clients, communities, and employees for over 170 years. The bank values diversity, challenges the status quo, and seeks individuals who are committed to driving commerce and prosperity. Standard Chartered celebrates unique talents and advocates for inclusion, fostering an environment where everyone can realize their full potential. What we offer: - Core bank funding for retirement savings, medical and life insurance - Time-off benefits including annual leave, parental/maternity leave, sabbatical, and volunteering leave - Flexible working options based on home and office locations - Proactive wellbeing support including digital wellbeing platforms and mental health resources - Continuous learning culture with opportunities for growth and development - Inclusive and values-driven organization that embraces unique diversity If you are looking for a purpose-driven career in banking and want to contribute to a bank that makes a positive impact, Standard Chartered welcomes your unique talents and celebrates diversity.,
ACTIVELY HIRING
posted 3 days ago
experience: 3 to 7 Yrs
location
Chennai, Tamil Nadu
skills
  • Statistical Programming
  • SAS
  • R
  • Python
  • QC
  • Problem Solving
  • CDISC
Job Description
Role Overview: You will be an Individual Contributor responsible for productive hands-on programming to support deliverables in the study/project/portfolio/standards team, specifically focusing on medium to high complex statistical programming deliverables to support assets and study teams. Initially, you will perform tasks with limited supervision, gradually transitioning to working independently. Your role will involve handling standards/study programming specific activities autonomously, including collaborating across stakeholders in different timezones. It is crucial to ensure adherence to high-quality programming standards in your daily work and complete tasks on time with quality and compliance to Pfizer's processes. You will work collaboratively with study teams and stakeholders such as clinicians and statisticians on milestones and deliverables. Active self-learning and delivering solutions in the statistical programming and data standards space will be key, along with contributing to SDSA initiatives globally and locally. Key Responsibilities: - Be accountable for your assigned work supporting the standards/study deliverables. - Contribute up to 80% of your time to programming deliverables assigned within the function's scope in SAS, R, or Python, and allocate 20% to self-learning, development, and growth. - Review, develop, and validate datasets, TFL as per CDISC aligned Pfizer Standards or Pfizer Data Standards for Study/Project/Portfolio (TA or Study Programming). - Explore the existing code base, execute/perform runs as required, and develop/modify as per needs and specifications suggested to the standards team as appropriate (Standards Programming). - Ensure appropriate documentation and QC throughout the study's lifespan for all programming deliverables across Standards, Programming, and Submissions. - Understand, develop, and review standard/study/project/portfolio requirements, specifications to gain a deeper understanding of expectations and programming requirements by collaborating with stakeholders. - Be knowledgeable in core safety standards as well as TA standards relevant to your project and lead the development of necessary standards for your study. - Exhibit routine and occasionally complex problem-solving skills, seeking direction when appropriate. - Regularly update leads on progress and time estimations to ensure smooth daily operations and accurate planning. - Advance your job knowledge to the next level by participating/contributing to opportunities both globally and locally. Qualifications Required: - Previous experience in statistical programming and data standards. - Proficiency in programming languages such as SAS, R, or Python. - Strong problem-solving skills and ability to work independently. - Excellent communication and collaboration skills. - Familiarity with CDISC standards is a plus. - Bachelor's or Master's degree in a related field is preferred. Note: Pfizer is an equal opportunity employer and complies with all applicable equal employment opportunity legislation in each jurisdiction in which it operates.,
ACTIVELY HIRING
posted 1 day ago
experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • Ruby
  • Java
  • MongoDB
  • ElasticSearch
  • RabbitMQ
  • AWS
Job Description
As a Senior Core Infrastructure Engineer at Poshmark, you will play a crucial role in expanding the Core Platform and driving engagement and personalization initiatives. You will be part of a dynamic team managing a high-traffic social marketplace and leveraging cloud-based technologies to cater to the rapidly growing user base. - Develop applications and service delivery platforms on a distributed cloud architecture - Utilize technologies like NoSQL, Queueing, and Search infrastructure - Design and implement new product features with a data-oriented approach - Enhance core components of the Poshmark product and cloud platform - Ensure seamless e-commerce transactions by collaborating with SRE and platform teams To excel in this role, you should have: - Strong computer science foundation with expertise in data structures, algorithms, and software design - Over 4 years of relevant software engineering experience - Proficiency in Ruby/Java, MongoDB, ElasticSearch, RabbitMQ, and cloud providers (such as AWS) - Experience in designing and developing microservices, creating web-based APIs and REST APIs - Understanding scalability and performance issues across application tiers - Bonus points for experience with NoSQL data solutions - A Bachelor's or Master's degree in Computer Science or a related discipline is preferred Poshmark is an Equal Opportunity Employer dedicated to celebrating diversity and fostering an inclusive environment for all employees. Join the team to make a positive impact and contribute to the thriving organization. Apply now if this role excites you and motivates you to come to work every day.,
ACTIVELY HIRING
posted 1 week ago
experience: 6 to 10 Yrs
location
Chennai, Tamil Nadu
skills
  • Data Analytics
  • Data Engineering
  • Statistical Modeling
  • Data Collection
  • Data Management
  • Data Transformation
  • Data Pipelines
  • ETL Processes
Job Description
As a Data Analytics Lead Analyst at Citigroup, your role involves staying abreast of developments in the field of data analytics and contributing to the directional strategy of the business. You will be a recognized technical authority in a specific area and will need basic commercial awareness. Your communication and diplomacy skills will be essential as you guide and influence colleagues in other areas and external customers. Your work will have a significant impact on the area, ultimately affecting the overall performance of the sub-function/job family.

Responsibilities:
- Integrate subject matter and industry expertise within a defined area.
- Contribute to data analytics standards for operational consistency.
- Apply an in-depth understanding of data analytics integration within the sub-function and align with the objectives of the entire function.
- Resolve occasionally complex issues and provide detailed analysis for recommended actions.
- Take responsibility for the volume, quality, timeliness, and delivery of data science projects.
- Assess risks in business decisions with a focus on compliance and safeguarding Citigroup.

Qualifications:
- 6-10 years of experience using code for statistical modeling of large data sets.

Education:
- Bachelor's/University degree or equivalent experience, potentially a Master's degree.

In addition to the core job description, the role also involves responsibilities in Data Engineering. This includes building data pipelines, collecting and managing data from various sources, and transforming raw data into usable formats using ETL processes.

Qualifications for Data Engineering:
- 6-10 years of experience using code for statistical modeling of large data sets.

Education for Data Engineering:
- Bachelor's/University degree in Computer Engineering, Computer Science, or Computer Applications.

Please note that this job description provides a high-level overview of the duties performed, and other job-related tasks may be assigned as required. If you are a person with a disability and require accommodation to use the search tools or apply for a career opportunity at Citigroup, review the Accessibility at Citi information. Additionally, you can refer to Citigroup's EEO Policy Statement and the Know Your Rights poster for more details.
ACTIVELY HIRING
posted 1 week ago
experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • Data Models
  • Data Analysis
  • Activations
  • Communication skills
  • User management
  • Collaboration
  • Data models
  • Data integrity
  • Security
  • Compliance
  • Troubleshooting
  • Performance
  • Data quality
  • Salesforce Data Cloud
  • Salesforce platform
  • Data modelling
  • Integration patterns
  • Data Streams
  • Data Lake
  • Data Transforms
  • Segments
  • Calculated insights
  • Data cloud administration
  • Declarative tools
  • Salesforce certifications
  • Custom solutions development
  • Security settings
  • Data configuration
  • Integrations
  • Automation workflows
  • System behaviour
  • Salesforce releases
  • Best practices
Job Description
As a skilled Salesforce Data Cloud Developer, you will be responsible for designing and implementing scalable solutions on Salesforce Data Cloud. Your key responsibilities will include: - Having strong hands-on experience of 4+ years with Salesforce Data Cloud and core Salesforce platform - Demonstrating a solid understanding of data modelling and integration patterns - Understanding Data Streams, Data Lake, Data Models, Data Transforms, and Data Analysis - Working with segments, activations, and calculated insights - Performing data cloud administration tasks and utilizing declarative tools Additionally, you should possess Salesforce certifications in Salesforce DataCloud/CDP. Your role will involve collaborating with business teams to gather, analyze, and translate requirements into technical solutions. You will design, develop, and deploy custom solutions using Salesforce Data Cloud, ensuring data integrity, security, and compliance with governance standards. You will also be responsible for tasks such as user management, security settings, and data configuration. Your expertise will be crucial in building and maintaining data models, integrations, and automation workflows. Furthermore, you will troubleshoot and resolve issues related to performance, data quality, and system behavior, staying updated with Salesforce releases and recommending best practices.,
ACTIVELY HIRING
posted 1 week ago
experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • Power BI
  • SQL
  • Data Analytics
  • Project coordination and management
Job Description
Role Overview: As an Experienced Professional in the role, you will be responsible for multiple Specializations within the Sub-family. You will apply your practical knowledge and experience to work independently with general supervision. Your focus will be on developing the Installed base, improving data quality, and supporting business use of EAM with successful use cases. You will also ensure compliance with regulations and guidelines while upholding Hitachi Energy's core values of safety and integrity. Key Responsibilities: - Develop the Installed base and improve data quality - Leverage/utilize the tool landscape to manage the right assets/Site details - Support business use of EAM and promote successful business use cases - Responsible for Documentation, Reporting, and Analysis - Monitor IB data and ensure accurate Customer, Site, and Installation Information data in ServIS - Monitor and share progress on meeting IB Quality - Provide or coordinate IB Management related trainings to local users - Generate and customize reports in the tool with advanced knowledge in Excel and Microsoft Office power application tools - Ensure compliance with applicable external and internal regulations, procedures, and guidelines Qualifications Required: - Graduates from any discipline are eligible to apply - 4 to 8 years of experience required - Strong experience in Power BI, SQL, and Data Analytics - Proven ability to manage global stakeholders - Diverse industry experience and electrical industry background preferred - Strong learning mindset and proficiency in data analytics, Project co-ordination, and management - Proficiency in both spoken & written English language Please note: Qualified individuals with disabilities may request reasonable accommodations for accessibility assistance during the job application process by completing a general inquiry form on the Hitachi Energy website.,
ACTIVELY HIRING
posted 1 week ago
experience: 12 to 18 Yrs
location
Chennai, Tamil Nadu
skills
  • data modeling
  • Communication skills
  • Stakeholder management
  • Gen AI/ML data architectures
  • Vector Databases
  • RAG patterns
  • Big Data processing tools
  • Cloud architecture
  • Data Engineering lifecycle
  • API integration
  • Cross-functional collaboration
Job Description
As a Principal Architect specializing in Generative AI & Data Platforms, your role involves architecting and operationalizing the end-to-end data ecosystem to support Generative AI, Machine Learning (ML), and advanced analytics. You will focus on building a robust, scalable, and compliant platform by leveraging modern cloud and Gen AI techniques like RAG to transform raw data into a reliable asset for data scientists, analysts, and business intelligence users. Your core responsibilities will include: - Defining and leading the technical roadmap for data architecture, optimizing data storage, indexing, and retrieval patterns for Gen AI and RAG pipelines. - Designing and managing highly available, performant, and cost-optimized data solutions across major cloud platforms (AWS, Azure, GCP) using Big Data technologies. - Creating advanced data models and semantic layers tailored to accelerate ML model training, feature engineering, and real-time analytical queries. - Implementing strict Data Governance, lineage, and privacy frameworks to ensure the ethical and secure use of data in all AI/ML initiatives. - Collaborating with MLOps and Data Science teams to streamline data flow from ingestion to deployment. - Architecting secure, high-speed data ingestion and egress mechanisms to integrate diverse data sources. - Acting as a subject matter expert on data architecture, effectively communicating technical vision to stakeholders. Your required skills and qualifications include: - Proven experience in designing scalable Gen AI/ML data architectures, Vector Databases, and RAG patterns. - Expertise in data modeling and Big Data processing tools such as Spark, Databricks, and Data Warehousing. - Deep hands-on experience architecting data solutions on major cloud platforms and native data services. - Strong understanding of the Data Engineering lifecycle, real-time streaming, and API integration techniques. - Exceptional communication, stakeholder management, and collaboration skills. - Bachelor's degree in Computer Science, Engineering, or a related field; an advanced degree is preferred.,
ACTIVELY HIRING
posted 1 week ago

Data Architect, Generative AI & Data Platform

Saaki Argus and Averil Consulting
experience: 12 to 18 Yrs
location
Chennai, Tamil Nadu
skills
  • data modeling
  • Communication skills
  • Stakeholder management
  • Gen AI/ML data architectures
  • Vector Databases
  • RAG patterns
  • Big Data processing tools
  • Cloud architecture
  • Data Engineering lifecycle
  • API integration
  • Cross-functional collaboration
Job Description
As the Principal Architect, Generative AI & Data Platforms, you will play a strategic role in architecting and operationalizing the end-to-end data ecosystem that fuels Generative AI, Machine Learning (ML), and advanced analytics. Your focus will be on building a robust, scalable, and compliant platform utilizing modern cloud and Gen AI techniques such as RAG to transform raw data into a reliable asset for data scientists, analysts, and business intelligence users. **Core Responsibilities:** - Define and lead the technical roadmap for data architecture, optimizing data storage, indexing, and retrieval patterns for Gen AI and RAG pipelines. - Design and manage highly available, performant, and cost-optimized data solutions across major cloud platforms (AWS, Azure, GCP). - Create advanced data models and semantic layers tailored to accelerate ML model training, feature engineering, and real-time analytical queries. - Implement strict Data Governance, lineage, and privacy frameworks to ensure ethical and secure use of data in all AI/ML initiatives. - Collaborate with MLOps and Data Science teams to streamline data flow from ingestion to model training, versioning, and deployment. - Architect secure, high-speed data ingestion and egress mechanisms to integrate internal and external data sources. - Act as the subject matter expert on data architecture and effectively communicate technical vision to stakeholders. **Required Skills & Qualifications:** - Proven experience in designing and implementing scalable Gen AI/ML data architectures, Vector Databases, and RAG patterns. - Expertise in data modeling and Big Data processing tools such as Spark, Databricks, and data warehousing. - Deep hands-on experience architecting data solutions on major cloud platforms like AWS, Azure, or GCP. - Strong understanding of the Data Engineering lifecycle, real-time streaming, and API integration techniques. - Exceptional communication, stakeholder management, and collaboration skills. - Bachelor's degree in Computer Science, Engineering, or a related field; advanced degree preferred.,
ACTIVELY HIRING
posted 3 weeks ago
experience: 10 to 14 Yrs
location
Chennai, Tamil Nadu
skills
  • SQL
  • Metadata management
  • Business Analysis
  • Effective Communication
  • Risk Management
  • Python Programming
  • Data documentation
  • Data Quality Management
Job Description
As an IMR (Issue Management & Resolution) Lead within the Risk function at Standard Chartered, your role will involve coordinating and driving timely resolution of data quality and risk reporting issues across multiple domains. You will act as a central point of contact between Risk teams, Functional Units, Data Domains, and the Chief Data Office (CDO) to ensure consistent prioritization, ownership, and closure of issues. Your responsibilities will include: - Facilitating cross-functional governance forums - Tracking and monitoring progress of issue resolution - Ensuring adherence to agreed remediation timelines and data quality standards - Identifying systemic themes and escalating critical risks - Driving sustainable solutions by collaborating with business, technology, and control partners Key stakeholders you will engage with include: - Head of Data Management, Risk & CFCC - Global Head of Data, Risk & CFCC - Data Strategy, Risk & CFCC - Risk Reporting Leads - FFG Product Owners - TTO - CDO - Governance Team Qualifications required for this role include: - 10+ years of relevant experience in a similar role, specifically in Risk / CFCC - Experience in data management, Data Governance, Data Quality execution, and data delivery projects - SQL skills - Strong stakeholder management and communication skills - Strong RCA skills - Ability to build good working relationships with people at all levels and work in a team environment - Ability to work under time pressure and adapt to changing requirements - Overall work experience of at least 12 years Skills and experience that will be beneficial for this role are: - Experience in Risk - Prior experience in SQL & Python Programming - Hands-on experience in Metadata management, data documentation, and metadata tooling - Experience in the Banking Domain - Business Analysis and Effective Communication skills - Experience in BCBS239 or similar regulatory areas - Experience in Data Quality Management processes (DQ monitoring, issue management, and reporting) About Standard Chartered: Standard Chartered is an international bank with a rich history of over 170 years. The bank is committed to making a positive difference for its clients, communities, and employees. At Standard Chartered, you can expect to work in a challenging yet rewarding environment where you are encouraged to grow and innovate. The organization values diversity, inclusion, and continuous learning to support the growth and development of its employees. Standard Chartered offers various benefits including: - Core bank funding for retirement savings, medical and life insurance - Time-off benefits such as annual leave, parental/maternity leave, sabbatical, and volunteering leave - Flexible working options - Proactive wellbeing support - Continuous learning culture - Inclusive and values-driven organization If you are looking for a purpose-driven career in banking and want to work for an organization that values diversity and inclusion, Standard Chartered is the place for you. Join us in driving commerce and prosperity through our unique diversity and be a part of a team that advocates inclusion and innovation.,
ACTIVELY HIRING
posted 3 weeks ago

Lead Software Engineer (.Net Core)

Societe Generale Global Solution Centre
experience: 5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • programming
  • scripting languages
  • designing
  • automation tools
  • automated testing
  • front-end programming
  • middle-tier programming
  • back-end programming
  • CI practices
  • Selenium framework
  • data processing systems
Job Description
You will be responsible for gaining expertise in the application, strengthening your knowledge of both its functional and technical aspects. Your key responsibilities will include:
- Creating and maintaining test strategy, test scenario, and test case documents.
- Ensuring timely communication with customers, stakeholders, and partners.
- Assessing production improvement areas such as recurrent issues.
- Providing suggestions for automating repetitive and regular production activities.
- Performing bug-free release validations and producing metrics, test, and defect reports.
- Ensuring there are no severity 1, severity 2, or severity 3 defects that affect the functioning of the product in production.
- Performing level 2/level 3 production support.
- Assisting the BA in preparing the Sprint demo presentation.

Profile required:
- Proven experience in programming/scripting languages.
- Complete understanding of front, middle, and back-end programming concepts.
- Proficiency in one programming language is an added advantage.
- Ability to use design and automation tools.
- Basic knowledge of CI practices.
- High learning agility.
- Excellent team player.
- Excellent communication skills.
- Demonstrated knowledge of relevant frameworks.
- Proven development experience in Python or Java.
- Strong knowledge of and experience in automated testing using the Selenium framework.
- Demonstrated experience in designing, building, and maintaining data processing systems.

Good to have:
- Experience with Subject7.

At Société Générale, you will have the opportunity to create, dare, innovate, and take action, as these qualities are part of the company's DNA. You will be directly involved, growing in a stimulating and caring environment, feeling useful on a daily basis, and developing or strengthening your expertise. Employees at Société Générale can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. The company is committed to supporting the acceleration of its Group's ESG strategy by implementing ESG principles in all activities and policies, translating them into business activity, the work environment, and responsible practices for environmental protection.
ACTIVELY HIRING
posted 2 months ago
experience: 7 to 11 Yrs
location
Chennai, Tamil Nadu
skills
  • Software Engineering
  • Application Development
  • Programming
  • System Integration
  • SQL
  • ETL
  • DevOps
  • Azure
  • Database
  • Oracle
  • COSMOS
  • Microservices
  • Angular
  • Azure Databricks
  • SSIS Packages
  • Azure Data Factory
  • Function Apps
  • Cloud Technologies
  • Azure Services
  • STRIIMs
  • .NET Core
Job Description
Role Overview: As a Software Engineer with 7-10 years of experience, your role will involve applying the principles of software engineering to design, develop, maintain, test, and evaluate computer software that provides business capabilities, solutions, and/or product suites. You will be responsible for systems life cycle management, ensuring delivery of technical solutions on time and within budget. Additionally, you will research and support the integration of emerging technologies and provide knowledge and support for applications development, integration, and maintenance. Your responsibilities will include developing program logic for new applications, analyzing and modifying logic in existing applications, analyzing requirements, testing, and integrating application components. You will also focus on web/internet applications specifically, using a variety of languages and platforms. Key Responsibilities: - Identify areas for improvement and develop innovative enhancements using available software development tools following design requirements of the customer. - Interpret internal/external business challenges and recommend integration of the appropriate systems, applications, and technology to provide a fully functional solution to a business problem. - Develop new features/functionality driven by program Increment (PI), including documenting features, obtaining approvals from Business and UPS IT Product Owners, story analysis, designing the required solution, coding, testing, non-functional requirements, and migration/deployment. - Develop new integration pipelines with SC360 - Databricks, Azure functions, Azure Data Factory, Azure DevOps, Cosmos DB, Oracle, Azure SQL, and SSIS Packages. - Work in alignment with business teams to support the development effort for all SC360 data-related PI items. - Develop fixes for defects and issues identified in the production environment. - Build POCs as needed to supplement the SC360 platform. - Develop and implement architectural changes as needed in the SC360 platform to increase efficiency, reduce cost, and monitor the platform. - Provide production support assistance as needed. - Ensure NFR compliance, including building according to UPS Coding Standards and Security Compliance. Qualification Required: - Bachelor's degree or higher in a related field - Strong communication skills (both oral and written) - Experience with Azure Databricks, SQL, ETL, SSIS Packages - Experience with Azure Data Factory, Function Apps, DevOps - Experience with Azure and other cloud technologies - Database experience in Oracle, SQL Server, and Cosmos - Knowledge of Azure Services such as key vault, app config, Blob storage, Redis cache, service bus, event grid, ADLS, App insight, etc. - Knowledge of STRIIMs - Agile life-cycle management - Vulnerability/Threat Analysis - Testing - Deployments across environments and segregation of duties Additional Company Details (if applicable): At UPS, equal opportunities, fair treatment, and an inclusive work environment are key values to which we are committed.,
ACTIVELY HIRING
posted 3 weeks ago

Core Software / Backend

Desirous Global Consulting
experience: 1 to 5 Yrs
location
Chennai, Tamil Nadu
skills
  • Python
  • Kafka
  • device communication
Job Description
Role Overview: You will be joining the zMeds core software team to work on healthcare innovation. Your role will involve utilizing your expertise in Python, device communication, and Kafka to optimize data management. You will be responsible for tackling device driver development, server-side intricacies, and transport protocols to ensure robust healthcare solutions. Your commitment to problem-solving will position you at the forefront of building transformative healthcare technology.

Key Responsibilities:
- Utilize expertise in Python, device communication, and Kafka to optimize data management
- Tackle device driver development, server-side intricacies, and transport protocols
- Ensure robust healthcare solutions are developed through problem-solving

Qualifications Required:
- 1-3 years of experience in a similar role
- Proficiency in Python, device communication, and Kafka
- Strong problem-solving skills and commitment to innovation

(Note: No additional details of the company are provided in the job description.)
ACTIVELY HIRING
posted 1 month ago
experience: 4 to 8 Yrs
location
Chennai, Tamil Nadu
skills
  • Salesforce
Job Description
As a Software Engineer III- Core Engineer III at TekWissen in Chennai, your role involves designing, developing, testing, and maintaining software applications and products to meet customer needs. You will be engaged in the entire software development lifecycle, including designing software architecture, writing code, testing for quality, and deploying the software to meet customer requirements. Full-stack software engineering roles, where you can develop all components of software, including user interface and server-side, also fall within your responsibilities. Key Responsibilities: - Engage with customers to deeply understand their use-cases, pain points, and requirements, showcasing empathy and advocating for user-centric software solutions. - Solve complex problems by designing, developing, and delivering using various tools, languages, frameworks, methodologies (like agile), and technologies. - Assess the requirements of the software application or service and determine the most suitable technology stack, integration method, deployment strategy, etc. - Create high-level software architecture designs that outline the overall structure, components, and interfaces of the application. - Collaborate with cross-functional teams like product owners, designers, architects, etc. - Define and implement software test strategy, guidelines, policies, and processes in line with the organization's vision, industry regulations, and market best practices. - Work on continuously improving performance and optimizing the application and implement new technologies to maximize development efficiency. - Support security practices to safeguard user data, including encryption and anonymization. - Create user-friendly and interactive interfaces. - Develop and maintain back-end applications like APIs and microservices using server-side languages. - Evaluate and incorporate emerging technologies and capabilities to deliver solutions and monitor and participate in solutions for new stack layers, often involving industry collaboration. Skills Required: - Salesforce Experience Required: - Engineer III Exp: Pract. In 2 coding languages or adv. Pract. in 1 language. - 6+ years in IT. - 4+ years in development. Experience Preferred: - Auto cloud - Vlocity omnistudio experience. Education Required: - Bachelor's Degree. As an equal opportunity employer, TekWissen Group supports workforce diversity.,
ACTIVELY HIRING