
319 Developer Analyst Jobs in Chennai

posted 2 weeks ago

Abinitio Developer

CAPGEMINI TECHNOLOGY SERVICES INDIA LIMITED
Experience: 5 to 10 Yrs
Location: Chennai, Bangalore, Noida, Hyderabad, Gurugram, Kolkata, Pune, Mumbai City, Delhi

skills
  • ab initio
  • unix shell scripting
  • sql
Job Description
Key Responsibilities:
- Design, develop, and implement ETL processes using Ab Initio GDE (Graphical Development Environment).
- Build and maintain Ab Initio graphs, plans, and sandboxes for data extraction, transformation, and loading.
- Work with business teams to understand data integration requirements and deliver efficient solutions.
- Use Ab Initio EME for version control, dependency management, and metadata governance.
- Perform data profiling, data validation, and quality checks using Ab Initio components and tools.
- Optimize ETL workflows for performance, scalability, and maintainability.
- Implement robust error handling, restartability, and logging mechanisms.
- Collaborate with DBAs, data modelers, and analysts to ensure data accuracy and consistency.
- Schedule and monitor jobs using Ab Initio Control Center (AICC) or enterprise schedulers.
- Support production systems, troubleshoot issues, and perform root cause analysis.

Required Technical Skills:
- Strong hands-on experience in Ab Initio GDE, EME, Co>Operating System, and Control Center.
- Proficiency with Ab Initio components such as Input/Output, Transform, Partition, Sort, Join, Lookup, Rollup, Reformat, Scan, and Dedup Sort, along with error handling using Rejects, Error Tables, and Error Ports for robust ETL design.
- Expertise in ETL design, development, and deployment for large-scale data environments.
- Proficiency in SQL and relational databases such as Oracle, Teradata, DB2, or SQL Server.
- Experience with UNIX/Linux shell scripting for automation and workflow integration.
- Understanding of data warehousing concepts (star schema, snowflake schema, slowly changing dimensions).
- Strong performance tuning and debugging skills in Ab Initio.
- Familiarity with data quality, metadata management, and data lineage.
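The data-validation and Dedup Sort work this role describes has a plain-SQL analogue. As an illustrative sketch only (not the employer's actual stack — Python's built-in sqlite3 stands in for the warehouse, and the staging table is hypothetical), a duplicate-key check of the kind a reject-port design would catch:

```python
import sqlite3

# Hypothetical staging table, used only for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customers (cust_id INTEGER, email TEXT);
    INSERT INTO stg_customers VALUES
        (1, 'a@x.com'), (2, 'b@x.com'), (2, 'b@x.com'), (3, 'c@x.com');
""")

# Data-quality check: natural keys that appear more than once,
# the kind of condition a Dedup Sort / reject flow handles in Ab Initio.
dupes = conn.execute("""
    SELECT cust_id, COUNT(*) AS n
    FROM stg_customers
    GROUP BY cust_id
    HAVING COUNT(*) > 1
""").fetchall()

print(dupes)  # → [(2, 2)]
```

The same GROUP BY / HAVING pattern runs unchanged on Oracle, Teradata, DB2, or SQL Server.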

posted 2 months ago
Experience: 6 to 11 Yrs
Location: Chennai, Bangalore, Navi Mumbai, Pune, Mumbai City

skills
  • sql
  • git
  • pyspark
  • azure databricks
  • azure data factory
  • data storage
  • data engineering
Job Description
We have an immediate opening for Azure Databricks with strong experience in PySpark & SQL. Looking for candidates who are available to join immediately or within 2 weeks. Interested candidates can share resumes to navyasree.kotha@deltassi.in

Role: ADB + PySpark
Job Type: Full-time (Client Payroll)
Mode: WFO/Hybrid
Location: Bengaluru/Chennai/Pune/Mumbai
Experience: 6+ years
Must have: Azure Databricks (ADB) + PySpark + SQL

Key Responsibilities:
- Develop and optimize ETL pipelines and data transformation workflows using Azure Databricks, PySpark, and SQL.
- Work with Azure Data Lake, Azure Synapse, and other Azure data services for data ingestion, storage, and analytics.
- Collaborate with data architects, analysts, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Implement data cleansing, validation, and transformation logic in PySpark for structured and unstructured data.
- Write efficient SQL queries for data extraction, analysis, and reporting.
- Optimize Databricks jobs for performance and cost efficiency.
- Implement best practices in code management, version control (Git), and CI/CD pipelines for Databricks.
- Ensure compliance with data governance, security, and quality standards.
- Troubleshoot and resolve data processing issues in production environments.

Required Skills and Experience:
- 3+ years of experience in data engineering or related roles.
- Strong hands-on experience with Azure Databricks and PySpark.
- Proficient in SQL query optimization, joins, window functions, and complex transformations.
- Experience with Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), and Azure Synapse Analytics.
- Familiarity with Delta Lake concepts (ACID transactions, time travel, etc.).
- Understanding of data modeling and ETL concepts.
- Proficient in Python for data manipulation and automation.
- Strong analytical and problem-solving skills.
- Experience with Agile/Scrum methodology and tools like JIRA or Azure DevOps.
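The window-function skill this listing asks for can be shown in a small, self-contained sketch. The table and column names are invented for the example, and plain sqlite3 stands in for Databricks SQL; the OVER (PARTITION BY … ORDER BY …) pattern itself carries over unchanged:

```python
import sqlite3

# Toy orders table (hypothetical names, for illustration only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-05', 120.0),
        ('alice', '2024-02-10',  80.0),
        ('bob',   '2024-01-20', 200.0);
""")

# ROW_NUMBER() per customer, newest order first -- a classic
# "latest record per key" transformation in ETL work.
rows = conn.execute("""
    SELECT customer, order_date, amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY order_date DESC
           ) AS rn
    FROM orders
""").fetchall()

# Keep only each customer's most recent order (rn = 1).
latest = sorted(r for r in rows if r[3] == 1)
print(latest)  # → [('alice', '2024-02-10', 80.0, 1), ('bob', '2024-01-20', 200.0, 1)]
```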
posted 2 months ago
Experience: 4 to 8 Yrs
Location: Chennai, Tamil Nadu
skills
  • Ab Initio
  • Talend
  • SQL
  • PLSQL
  • Database systems
  • Web services
  • Data warehousing
  • Version control tools
  • Analytical skills
  • Communication skills
  • ETL development
  • API integration
  • Cloud platforms
  • ETL best practices
  • Problem-solving skills
  • Teamwork skills
Job Description
As an Applications Development Intermediate Programmer Analyst, your role involves designing, developing, and optimizing ETL workflows and data integration solutions using Ab Initio or Talend. You will work closely with business and technology teams to ensure seamless data processing and transformation.

Responsibilities:
- Design, develop, and implement ETL pipelines using Ab Initio or Talend.
- Work with structured, semi-structured, and unstructured data from multiple sources.
- Optimize data processing and transformation workflows for efficiency and scalability.
- Troubleshoot and resolve performance issues in ETL processes.
- Collaborate with data architects, analysts, and business teams to define data requirements.
- Ensure data quality, integrity, and governance standards are met.
- Develop and maintain metadata and documentation for ETL processes.
- Implement and manage job scheduling and automation tools.

Qualifications:
- 4-6 years of relevant experience working with Ab Initio (GDE, Express>IT, Conduct>IT) or Talend (Data Fabric, Open Studio, etc.).
- Strong knowledge of SQL, PL/SQL, and database systems (Oracle, SQL Server, PostgreSQL, etc.).
- Experience with ETL optimization, debugging, and performance tuning.
- Experience in API integration, web services, and cloud platforms (AWS, Azure, GCP) is a plus.
- Strong understanding of data warehousing concepts and ETL best practices.
- Hands-on experience with version control tools (Git, SVN, etc.).
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork skills.
- Consistently demonstrates clear and concise written and verbal communication.
- Demonstrated problem-solving and decision-making skills.
- Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements.

Preferred Qualifications:
- Certifications in Ab Initio, Talend, or cloud technologies are a plus.
- Experience with CI/CD pipelines for ETL deployment.
Education:
- Bachelor's degree/University degree or equivalent experience.

posted 1 week ago
Experience: 4 to 8 Yrs
Location: Chennai, Tamil Nadu
skills
  • manual testing
  • automation testing
  • SQL
  • data validation
  • reporting
  • API testing
  • web applications
  • communication
  • documentation
  • Ecommerce
  • problem-solving
Job Description
As a QA Analyst / Data Migration Tester at our company, you will play a crucial role in validating data and functionality across large-scale Ecommerce or digital platforms. Your primary responsibility will be to ensure data integrity, completeness, and functional accuracy during migration and integration initiatives.

Key Responsibilities:
- Understand the data migration scope, mapping, and transformation logic between source and target systems.
- Prepare and execute test plans, test cases, and validation scripts for functional, data, and integration testing.
- Perform data validation, reconciliation, and consistency checks across product, customer, order, and pricing data.
- Validate APIs, database transactions, and user journeys across web and back-office modules.
- Support mock migration runs and cutover rehearsals by identifying data issues and reporting discrepancies.
- Log, track, and verify defects using tools like JIRA, ensuring timely resolution and revalidation.
- Collaborate closely with developers, business analysts, and data engineers for end-to-end testing coverage.
- Document test evidence, results, and migration validation reports for audit and go-live readiness.

Required Skills:
- 3-8 years of hands-on experience in manual or automation testing for Ecommerce or enterprise web applications.
- Strong knowledge of SQL and experience with data validation and reporting.
- Familiarity with API testing tools (e.g., Postman, Swagger).
- Strong understanding of Ecommerce data entities: Products, Categories, Customers, Orders, Pricing, and Inventory.
- Strong analytical and problem-solving capability.
- Excellent communication and documentation skills.

Good to Have:
- Experience in data migration / ETL testing projects.
- Understanding of cloud-based Ecommerce platforms or microservices architectures.
- Exposure to automation frameworks (Selenium, Karate, REST Assured, etc.).
- Basic knowledge of integration flows between Ecommerce, ERP, and OMS systems.

Education:
- Bachelor's degree in Computer Science, Information Technology, or equivalent field.
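The reconciliation checks this role describes reduce to comparing counts and key-level contents between source and target after a migration run. A minimal, self-contained sketch (the record sets and field layout are invented for illustration, not any specific platform's API):

```python
# Source-vs-target reconciliation sketch: compare record counts, then
# find keys missing in the target or with drifted values after migration.
source = {101: ("SKU-1", 499.0), 102: ("SKU-2", 129.0), 103: ("SKU-3", 75.0)}
target = {101: ("SKU-1", 499.0), 103: ("SKU-3", 79.0)}  # 102 lost, 103 drifted

missing = sorted(set(source) - set(target))
mismatched = sorted(k for k in source.keys() & target.keys()
                    if source[k] != target[k])

report = {
    "source_count": len(source),
    "target_count": len(target),
    "missing_in_target": missing,
    "value_mismatches": mismatched,
}
print(report)
# → {'source_count': 3, 'target_count': 2,
#    'missing_in_target': [102], 'value_mismatches': [103]}
```

In practice the same comparison is usually expressed as SQL anti-joins or checksum queries against the two databases; the logic is identical.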
posted 2 weeks ago
Experience: 3 to 9 Yrs
Location: Chennai, All India
skills
  • ServiceNow
  • BRD
  • FRD
  • User Stories
  • ITSM
  • CSM
  • ITIL
  • Stakeholder Management
  • Communication
  • Documentation
  • Business Analyst
  • ITOM
  • ITBM
  • HRSD
  • Analytical Thinking
Job Description
As a ServiceNow Business Analyst, your role will involve supporting ongoing ServiceNow initiatives by managing end-to-end project delivery, preparing detailed documentation, and leading client interactions. This is a remote contractual position for an initial 3 months, extendable based on project needs and performance.

Key Responsibilities:
- Act as the liaison between business stakeholders and ServiceNow technical/development teams.
- Gather, analyze, and document business requirements for ServiceNow modules and workflows.
- Prepare and maintain documentation including BRD, FRD, Use Cases, User Stories, and Test Scenarios.
- Participate in end-to-end project delivery, from requirements gathering to UAT.
- Conduct workshops, client meetings, and requirement validation sessions.
- Translate business requirements into functional and technical specifications aligned with ServiceNow capabilities.
- Collaborate with developers and architects to ensure business needs are met effectively.
- Support UAT, training, and post-implementation activities.
- Manage stakeholder expectations and ensure timely project deliverables.

Required Skills and Experience:
- 5 to 9 years of total experience, with at least 3+ years as a Business Analyst in ServiceNow projects.
- Strong understanding of ServiceNow modules like ITSM, ITOM, ITBM, HRSD, or CSM.
- Proven experience in end-to-end ServiceNow implementation projects.
- Hands-on experience in preparing BRD, FRD, User Stories, and Process Flow Diagrams.
- Excellent client-facing and stakeholder management skills.
- Strong communication, analytical thinking, and documentation abilities.
- Ability to work effectively in a remote and cross-functional team setup.
- ITIL certification or understanding of ITIL processes is advantageous.
- ServiceNow certifications (Business Analyst, CSA, or equivalent).
- Bachelor's degree in Computer Science, Information Technology, or related field.
- Certifications in ServiceNow or Business Analysis (CBAP, CCBA) are desirable.
posted 1 week ago
Experience: 3 to 7 Yrs
Location: Chennai, Tamil Nadu
skills
  • Mule ESB
  • SOAP web services
  • Jenkins
  • Maven
  • Git
  • XML
  • JSON
  • Docker
  • Kubernetes
  • Mulesoft Anypoint Platform
  • Anypoint Studio
  • DataWeave
  • RESTful web services
  • RAML
  • Swagger/OpenAPI
  • CI/CD practices
  • Anypoint CI/CD
  • CSV
  • CloudHub
Job Description
As a Senior Mulesoft Developer at EY, you will have the opportunity to lead integration initiatives and drive innovation within the integration practices.

**Key Responsibilities:**
- Lead the design and development of enterprise-level integration solutions using Mulesoft Anypoint Platform.
- Serve as a subject matter expert in Mulesoft technologies and provide guidance to the development team.
- Collaborate with project managers, business analysts, and other stakeholders to define integration requirements and ensure alignment with business goals.
- Develop high-quality, reusable APIs and integration flows using Mule ESB and Anypoint Studio.
- Create and maintain complex data transformations and mappings using DataWeave.
- Oversee end-to-end integration solutions from design through to deployment and post-production support.
- Implement best practices for API-led connectivity and ensure compliance with security and governance policies.
- Conduct code reviews, enforce coding standards, and foster a culture of continuous improvement.
- Optimize application performance, scalability, and security.
- Lead the troubleshooting and resolution of integration issues.
- Mentor junior developers and contribute to their professional growth.
- Document integration processes, design decisions, and operational manuals.
- Stay abreast of the latest Mulesoft features and updates, and recommend adoption of new technologies where appropriate.

**Qualifications:**
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 3-5 years of hands-on experience with Mulesoft Anypoint Platform.
- Advanced knowledge of Mule 3.x and/or Mule 4.x, Anypoint Studio, API Designer, and Mulesoft's API-led connectivity approach.
- Proven track record of developing complex integration solutions in a large-scale environment.
- Strong experience with RESTful and SOAP web services, as well as RAML and Swagger/OpenAPI specifications.
- In-depth understanding of integration patterns, SOA, ESB architecture, and microservices.
- Expertise in CI/CD practices and tools such as Jenkins, Maven, Git, and Anypoint CI/CD.
- Proficiency in various data formats and transformation techniques (XML, JSON, CSV, etc.).
- Experience with both CloudHub and on-premises deployments, as well as containerization technologies like Docker and Kubernetes.
- Exceptional problem-solving, analytical, and technical troubleshooting skills.
- Excellent leadership, communication, and interpersonal skills.
- Mulesoft Certified Integration Architect or Mulesoft Certified Platform Architect is highly desirable.

Join EY and contribute to building a better working world, where diverse teams across over 150 countries provide trust through assurance and help clients grow, transform, and operate.
posted 1 week ago
Experience: 2 to 6 Yrs
Location: Chennai, Tamil Nadu
skills
  • ETL
  • Data Processing
  • Data Transformation
  • Python
  • Spark
  • Hadoop
  • Data Engineering
  • SQL
Job Description
As an ETL Developer at our company, you will be responsible for designing, implementing, and optimizing distributed data processing jobs that handle large-scale data in the Hadoop Distributed File System (HDFS) using Apache Spark and Python. Your role will require a deep understanding of data engineering principles, proficiency in Python, and hands-on experience with Spark and Hadoop ecosystems. You will collaborate with data engineers, analysts, and business stakeholders to process and transform data and drive insights and data-driven decisions.

Responsibilities:
- Design and implement Spark applications to process and transform large datasets in HDFS.
- Develop ETL pipelines in Spark using Python for data ingestion, cleaning, aggregation, and transformations.
- Optimize Spark jobs for efficiency, reducing run time and resource usage.
- Fine-tune memory management, caching, and partitioning strategies for optimal performance.
- Load data from different sources into HDFS, ensuring data accuracy and integrity.
- Integrate Spark applications with Hadoop frameworks like Hive, Sqoop, etc.
- Troubleshoot and debug Spark job failures; monitor job logs and the Spark UI to identify issues.

Qualifications:
- 2-5 years of relevant experience.
- Experience in programming/debugging used in business applications.
- Working knowledge of industry practice and standards.
- Comprehensive knowledge of specific business areas for application development.
- Working knowledge of program languages.
- Consistently demonstrates clear and concise written and verbal communication.
- Expertise in handling complex large-scale Warehouse environments.
- Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities.

Education:
- Bachelor's degree in a quantitative field (such as Engineering, Computer Science) or equivalent experience.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
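The partitioning strategy the description asks candidates to fine-tune can be illustrated without a cluster. Spark routes rows to shuffle partitions by hashing the key; this toy sketch (plain Python, invented data, a byte-sum standing in for Spark's real hash) shows how a skewed key piles rows into one partition, which is why partition counts and key choice matter for run time:

```python
from collections import defaultdict

def partition_by_key(rows, num_partitions):
    """Route (key, value) rows to buckets by key hash -- a toy model of
    Spark's hash partitioning. A stable byte-sum keeps it reproducible."""
    parts = defaultdict(list)
    for key, value in rows:
        bucket = sum(key.encode()) % num_partitions
        parts[bucket].append((key, value))
    return dict(parts)

rows = [("user1", 10), ("user2", 20), ("user1", 30), ("user3", 5)]
parts = partition_by_key(rows, num_partitions=2)

# All rows for a given key land in the same bucket, so per-key
# aggregations need no further shuffling -- but hot keys skew buckets.
sizes = {b: len(v) for b, v in sorted(parts.items())}
print(sizes)  # → {0: 3, 1: 1}
```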
posted 1 month ago
Experience: 3 to 7 Yrs
Location: Chennai, Tamil Nadu
skills
  • Apex programming
  • Workflow
  • Access management
  • Salesforce configuration
  • Service cloud
  • Salesforce architecture
  • Salesforce native database concepts
  • Business process automation tools
  • Process builder
  • Lightning flows
  • Salesforce user management
  • Licensing concepts
  • Synchronous operations in Apex programming
  • Asynchronous operations in Apex programming
  • Salesforce governor limits
  • Development and debugging with Salesforce debug tools
  • Troubleshooting technologically challenging customer issues
  • Advanced debugging techniques
  • SOQL queries
Job Description
As a Salesforce Developer Analyst, you will be responsible for configuring Salesforce, developing Apex code, and utilizing various Salesforce tools to streamline business processes. You will work with Salesforce architecture, database concepts, and user management to ensure smooth operations. Your expertise in Salesforce governor limits and debug tools will be crucial in troubleshooting customer issues and optimizing system performance. Additionally, your proficiency in SOQL queries will aid in efficient data retrieval and manipulation.

Key Responsibilities:
- Configure Salesforce settings and objects to meet business requirements
- Develop Apex code for custom functionalities and integrations
- Utilize workflow, process builder, and lightning flows for business process automation
- Manage user access and licenses within the Salesforce platform
- Troubleshoot and resolve complex technical issues using debug tools
- Optimize system performance by adhering to Salesforce best practices and governor limits

Qualifications Required:
- 3-5 years of experience in Salesforce development
- Salesforce Admin and/or Salesforce Developer (PD1) certifications
- Strong knowledge of Salesforce architecture and configuration
- Proficiency in Apex programming and SOQL queries
- Experience with business process automation tools
- Ability to troubleshoot and debug technically challenging issues
- Immediate joiners preferred
posted 2 days ago

ODI Developers

Codeboard Technology Private Limited
Experience: 2 to 6 Yrs
Location: Chennai, Tamil Nadu
skills
  • Oracle Data Integrator
  • ETL
  • Data Integration
  • SQL
  • PLSQL
  • Oracle Database
Job Description
As an ODI Developer, your role involves designing, developing, and maintaining data integration solutions using Oracle Data Integrator (ODI) technology. You will be primarily focused on extracting, transforming, and loading (ETL) data from various sources into target data repositories. Collaboration with business analysts, data architects, and database administrators will be essential to understand data requirements and implement efficient data integration processes. The ideal candidate for this position should have a strong background in data integration concepts, experience with ODI tools, and expertise in Oracle databases.

Key Responsibilities:
- Design and implement ETL processes using Oracle Data Integrator to extract, transform, and load data from source systems to target databases or data warehouses.
- Create data mapping specifications to ensure accurate data transformation and alignment with business rules.
- Utilize your proven experience as an ODI Developer or in a similar ETL development role.
- Demonstrate in-depth knowledge of Oracle Data Integrator (ODI) and its components.
- Apply a strong understanding of data integration concepts and ETL best practices.
- Showcase proficiency in writing SQL queries and PL/SQL scripts for data manipulation and validation.
- Have familiarity with Oracle database technologies and performance tuning techniques.

Qualifications Required:
- Strong background in data integration concepts.
- Experience with ODI tools.
- Expertise in Oracle databases.
posted 2 months ago

Credit Analyst

ODH DEVELOPERS PRIVATE LIMITED
Experience: 14 to 22 Yrs
Location: Chennai, Singapore, Oman, Muzzafarpur, Zimbabwe, Saudi Arabia, Junagarh, Bangalore, Tanzania, Kuwait, Noida, Janjgir Champa, Philippines, Ghaziabad, Sudan, Hyderabad, Kolkata, Norway, Sweden, Mumbai City

skills
  • scheduling
  • communication
  • budgeting
  • leadership
  • time management
  • problem solving
  • organizational skills
  • project management
Job Description
Credit Analyst Job Responsibilities:
- Gathers and analyzes loan applicants' financial data to evaluate risk.
- Assesses creditworthiness of individuals, companies, and institutions.
- Collaborates with other financial experts to approve or deny loans.
- Makes recommendations about whether to increase, adjust, extend, or close lines of credit.
- Undertakes risk analysis using regional, sector-specific, environmental, and other financial data.
- Prepares and presents credit reports.
- Completes quality assurance reviews.
- Gauges market trends.
- Monitors and adheres to collateral compliance covenants.
- Ensures that all loans are made in adherence with federal, state, and local financial regulations.
- Analyzes data to verify information and uncover fraud.
- Helps to update and improve credit rating criteria.
posted 2 months ago

Junior Financial Analyst

ODH DEVELOPERS PRIVATE LIMITED
Experience: 14 to 22 Yrs
Location: Chennai, Singapore, Oman, Muzzafarpur, Saudi Arabia, Raigarh, Tanzania, Bangalore, Noida, Thailand, Sudan, Hyderabad, Gurugram, Kolkata, Norway, Sweden, Mumbai City, Delhi

skills
  • scheduling
  • communication
  • budgeting
  • leadership
  • time management
  • problem solving
  • organizational skills
  • project management
Job Description
As a Junior Financial Analyst, you will play a crucial role in supporting our financial analysis and reporting needs. You will monitor key financial metrics, contribute to business metric reporting, and provide insightful analyses to assist in decision-making for senior management. Your work will involve consolidating and analyzing financial data, assisting with financial planning, and ensuring clarity and accuracy in financial reporting. You'll need a Bachelor's degree in finance or a related field, strong analytical skills, proficiency in Excel/Sheets and PowerPoint, and a curiosity for data analysis. This role is ideal for a driven, detail-oriented individual eager to collaborate and grow in a dynamic team environment.

Responsibilities:
- Monitoring financial performance and identifying trends.
- Supporting monthly and quarterly business metric reporting.
- Conducting ad-hoc analysis for senior management.
- Consolidating and analyzing financial data.
- Assisting with financial planning and forecasting.
- Maintaining a corporate repository of metric definitions.
posted 3 weeks ago

Political Analyst

ODH DEVELOPERS PRIVATE LIMITED
Experience: 13 to 20 Yrs
Location: Chennai, Ethiopia, Singapore, Oman, Saudi Arabia, Kuwait, Nalbari, Bangalore, Baloda Bazar, Noida, Sudan, Hyderabad, Kolkata, Pune, Jordan, Mumbai City, Ghana, Kenya, Delhi, Egypt

skills
  • scheduling
  • communication
  • budgeting
  • leadership
  • time management
  • problem solving
  • organizational skills
  • project management
Job Description
We are looking for a passionate political analyst to conduct research on political ideas and analyze government policies, political trends, and related issues. As a political analyst, you should be able to study the development of political systems, research various political subjects, and collect and analyze data. Ultimately, you should be able to predict political, social, and economic trends; evaluate cultures, values, and political ideologies; and present unbiased reports.

Responsibilities:
- Research political subjects such as foreign relations and political ideologies
- Collect data from sources such as public opinion surveys and election results
- Use statistical analysis to interpret research findings
- Develop political theories based on research and historical documents
- Forecast political, economic, and social trends
- Present research results by writing reports, giving presentations, and publishing articles
- Evaluate the effects of policies and laws on government, businesses, and people
- Monitor current events, policy decisions, and legislation changes
- Stay up-to-date with international trends and developments
- Raise public awareness of important political and social issues
- Establish contacts and sources to use in future research
posted 4 weeks ago

Business Analyst Trainee

ODH DEVELOPERS PRIVATE LIMITED
Experience: 13 to 21 Yrs
Location: Chennai, Oman, Qatar, Mahasamund, Noida, Samastipur, United Arab Emirates, Hyderabad, Changlang, Kolkata, Malaysia, Pune, Jordan, Mumbai City, Ghana, Delhi, Kenya, Egypt

skills
  • communication skills
  • communication
  • time management
  • leadership
  • budgeting
  • problem solving
  • organizational skills
Job Description
We are looking for a Business Analyst who will be the vital link between our information technology capacity and our business objectives by supporting and ensuring the successful completion of analytical, building, testing, and deployment tasks of our software products' features.

Responsibilities:
- Define configuration specifications and business analysis requirements
- Perform quality assurance
- Define reporting and alerting requirements
- Own and develop relationships with partners, working with them to optimize and enhance our integration
- Help design, document, and maintain system processes
- Report on common sources of technical issues or questions and make recommendations to the product team
- Communicate key insights and findings to the product team
- Constantly be on the lookout for ways to improve monitoring, discover issues, and deliver better value to the customer
posted 4 weeks ago

Product Analyst

ODH DEVELOPERS PRIVATE LIMITED
Experience: 12 to 19 Yrs
Location: Chennai, Singapore, Oman, Qatar, Kuwait, Noida, Kokrajhar, United Arab Emirates, Hyderabad, Kozhikode, Kolkata, Malaysia, Pune, Mumbai City, Jordan, Bhavnagar, Ghana, Delhi, Egypt

skills
  • budgeting
  • leadership
  • communication
  • time management
  • problem solving
  • organizational skills
Job Description
We are looking for a Product Analyst to join our team and assist us in recommending the best products to launch to increase profitability in our organization. Product Analyst responsibilities include looking at market data to determine what products to launch and interviewing customers to understand their needs. Ultimately, you will work with customers and various leaders in our organization to help us determine what products we should launch to maximize profitability.

Responsibilities:
- Develop and oversee small to medium scale projects
- Analyze metrics to continually improve company products
- Contribute to company operations, such as costing, inventory control, planning, and budgeting
- Assist the company in achieving short and long-term goals relating to product growth
- Work with other company departments to improve the analysis and presentation of products

Requirements and Skills:
- Proven work experience as a Product Analyst or similar role
- Proficient in database software
- Strong communication skills
posted 6 days ago

databricks developer

Vy Systems Private Limited
Experience: 4 to 8 Yrs
Salary: 50,000 - 3.5 LPA
Work: Remote
Location: Chennai, Bangalore, Noida, Hyderabad, Gurugram, Kolkata, Pune, Mumbai City, Delhi

skills
  • sql
  • aws
  • databricks
Job Description
Job Title: Databricks Developer Experience: 48 Years Location: Remote Job Summary We are looking for an experienced Databricks Developer with strong expertise in PySpark, SQL, Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The ideal candidate will design, develop, optimize, and maintain scalable data pipelines and analytics solutions using the Databricks unified data platform. Key Responsibilities Design, build, and optimize ETL/ELT pipelines using Databricks and PySpark. Develop scalable and high-performance data processing workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse). Implement and manage Delta Lake for ACID transactions, schema evolution, and time-travel data needs. Write efficient and complex SQL queries for data transformation, validation, and analytics. Build and maintain data ingestion frameworks, including streaming (Kafka, Kinesis, EventHub) and batch processing. Optimize Databricks clusters, jobs, workflows, and performance tuning PySpark code. Collaborate with data engineers, analysts, and product teams to define data requirements and deliver high-quality solutions. Ensure data governance, security, compliance, and best practices across cloud environments. Troubleshoot production pipelines and support CI/CD deployments for Databricks jobs. Required Skills & Experience 48 years of experience in Data Engineering or Big Data development. Strong hands-on experience with Databricks (clusters, jobs, notebooks, workflows). Advanced proficiency with PySpark for batch/stream processing. Strong programming skills in Python. Expertise in SQL (complex transformations, window functions, optimization). Hands-on experience working with AWS (preferred) or Azure cloud services. Experience with Delta Lake, parquet, and data lake architectures. Familiarity with CI/CD pipelines (GitHub Actions, Azure DevOps, Jenkins, etc.). 
- Good understanding of data modeling, performance tuning, and distributed computing.

APPLY: sanjai@vyystems.com
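The window-function expertise this listing calls for can be shown with a small, self-contained example. SQLite (via Python's standard library) stands in for Databricks SQL here, and the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database stands in for a Databricks SQL warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, order_day TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('south', '2024-01-01', 100.0),
        ('south', '2024-01-02', 150.0),
        ('north', '2024-01-01', 200.0),
        ('north', '2024-01-02', 50.0);
""")

# Rank each day's sales within its region and keep a per-region running
# total -- the kind of windowed transformation the role describes.
rows = conn.execute("""
    SELECT region, order_day, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region ORDER BY order_day) AS running_total
    FROM orders
    ORDER BY region, order_day
""").fetchall()

for row in rows:
    print(row)
```

The same `PARTITION BY` / `OVER` syntax carries over to Spark SQL and Databricks SQL.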
posted 2 weeks ago

Spark Developer

Early Career
experience2 to 6 Yrs
location
Chennai, Tamil Nadu
skills
  • ETL
  • Data Processing
  • Data Transformation
  • Python
  • Spark
  • Hadoop
  • Data Engineering
  • SQL
Job Description
As an ETL Developer at our company, your responsibilities will include designing, implementing, and optimizing distributed data processing jobs that handle large-scale data in the Hadoop Distributed File System (HDFS) using Apache Spark and Python. You will need a deep understanding of data engineering principles, proficiency in Python, and hands-on experience with the Spark and Hadoop ecosystems. Collaborating with data engineers, analysts, and business stakeholders to process and transform data and drive insights and data-driven decisions will also be a key aspect of your role.

Key Responsibilities:
- Design and implement Spark applications to process and transform large datasets in HDFS.
- Develop ETL pipelines in Spark using Python for data ingestion, cleaning, aggregation, and transformation.
- Optimize Spark jobs for efficiency, reducing run time and resource usage.
- Fine-tune memory management, caching, and partitioning strategies for optimal performance.
- Load data from different sources into HDFS, ensuring data accuracy and integrity.
- Integrate Spark applications with Hadoop frameworks such as Hive and Sqoop.
- Troubleshoot and debug Spark job failures, monitoring job logs and the Spark UI to identify issues.

Qualifications:
- 2-5 years of relevant experience
- Experience in programming/debugging for business applications
- Working knowledge of industry practices and standards
- Comprehensive knowledge of the specific business area for application development
- Working knowledge of programming languages
- Consistently demonstrates clear and concise written and verbal communication
- Expertise in handling complex, large-scale warehouse environments
- Hands-on experience writing complex SQL queries and exporting/importing large amounts of data using utilities

Education:
- Bachelor's degree in a quantitative field (such as Engineering or Computer Science) or equivalent experience

This job description provides a high-level overview of the types of work performed.
Other job-related duties may be assigned as required.
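The partitioning strategies this role tunes can be sketched without a cluster. The snippet below mimics the behavior of a hash partitioner, the default strategy Spark uses to distribute keyed rows across tasks; PySpark itself is omitted for portability, and the sample keys and partition count are invented:

```python
import zlib
from collections import defaultdict

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a key to a partition, as a hash partitioner does.

    zlib.crc32 is used instead of Python's built-in hash(), which is
    randomized per process and would make partition placement unstable.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

def partition_rows(rows, num_partitions):
    """Group (key, value) rows into partitions the way a shuffle would."""
    parts = defaultdict(list)
    for key, value in rows:
        parts[partition_for(key, num_partitions)].append((key, value))
    return dict(parts)

rows = [("user1", 10), ("user2", 20), ("user1", 30), ("user3", 40)]
parts = partition_rows(rows, num_partitions=4)

# All rows sharing a key land in the same partition -- the property that
# lets joins and aggregations run without a second shuffle.
```

Skewed keys concentrating in one partition is exactly the situation the "fine-tune partitioning strategies" bullet above is about.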
ACTIVELY HIRING
posted 1 week ago
experience7 to 11 Yrs
location
Chennai, Tamil Nadu
skills
  • Python
  • Java
  • GCP
  • API development
  • REST
  • Git
  • DevOps
  • Cloud security
  • Nodejs
  • GKE
  • App Engine
  • Cloud Run
  • Cloud Functions
  • AlloyDB
  • Cloud Spanner
  • Cloud SQL
  • GraphQL
  • CICD
  • AI tools
Job Description
As a Back-End Developer for our US-based client, a globally recognized provider of flexible and scalable outsourced warehousing solutions, your role will involve developing a next-generation Warehouse-as-a-Service platform on Google Cloud Platform (GCP). Your expertise in building secure, scalable, and high-performance back-end services in a microservices architecture will be crucial for this position.

**Key Responsibilities:**
- Design, develop, and maintain microservices and APIs running on GKE, Cloud Run, App Engine, and Cloud Functions.
- Build secure, scalable REST and GraphQL APIs to support front-end applications and integrations.
- Collaborate with the GCP Architect to ensure back-end design aligns with enterprise architecture and security best practices.
- Implement integration layers between GCP-hosted services, AlloyDB, Cloud Spanner, Cloud SQL, and third-party APIs.
- Deploy services using Gemini Code Assist, CLI tools, and Git-based CI/CD pipelines.
- Optimize service performance, scalability, and cost efficiency.
- Implement authentication, authorization, and role-based access control using GCP Identity Platform / IAM.
- Utilize AI/ML services (e.g., Vertex AI, Document AI, NLP APIs) to enable intelligent back-end capabilities.
- Participate in code reviews and enforce clean, maintainable coding standards.

**Qualifications Required:**
- 6-8 years of back-end development experience, with at least 3 years in senior/lead analyst roles.
- Proficiency in one or more back-end programming languages: Node.js, Python, or Java.
- Strong experience with GCP microservices deployments on GKE, App Engine, Cloud Run, and Cloud Functions.
- Deep knowledge of AlloyDB, Cloud Spanner, and Cloud SQL for schema design and query optimization.
- Experience in API development (REST/GraphQL) and integration best practices.
- Familiarity with Gemini Code Assist for code generation and CLI-based deployments.
- Understanding of Git-based CI/CD workflows and DevOps practices.
- Experience integrating AI tools into back-end workflows.
- Strong understanding of cloud security and compliance requirements.
- Excellent communication skills for working in a distributed/global team environment.

Please note that this position is an on-site role based in Chennai.
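The role-based access control responsibility in this listing boils down to mapping roles to permissions and gating each request on the caller's role. A minimal sketch in plain Python follows; the roles, permissions, and status codes are invented placeholders, not GCP Identity Platform or IAM APIs (a real service would derive the role from a verified identity token):

```python
# Minimal role-based access control (RBAC) sketch: each role grants a set
# of permissions, and every request is checked against that set.

ROLE_PERMISSIONS = {
    "viewer": {"orders:read"},
    "operator": {"orders:read", "orders:update"},
    "admin": {"orders:read", "orders:update", "orders:delete"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

def handle_request(role: str, permission: str) -> int:
    """Return an HTTP-style status code for the access decision."""
    return 200 if is_allowed(role, permission) else 403
```

Keeping the role-to-permission mapping in one place (rather than scattering role checks through handlers) is what makes an RBAC scheme auditable.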
ACTIVELY HIRING
posted 2 weeks ago

Backend Developer

House of Shipping
experience7 to 11 Yrs
location
Chennai, Tamil Nadu
skills
  • Python
  • Java
  • GCP
  • API development
  • REST
  • Git
  • DevOps
  • Cloud security
  • Nodejs
  • GKE
  • App Engine
  • Cloud Run
  • Cloud Functions
  • AlloyDB
  • Cloud Spanner
  • Cloud SQL
  • GraphQL
  • CICD
  • AI tools
Job Description
As a Back-End Developer at our client's US-based company, you will play a crucial role in developing a next-generation Warehouse-as-a-Service platform on Google Cloud Platform (GCP). Your expertise in building secure, scalable, and high-performance back-end services in a microservices architecture will be key to the success of this project.

**Key Responsibilities:**
- Design, develop, and maintain microservices and APIs running on GKE, Cloud Run, App Engine, and Cloud Functions.
- Build secure, scalable REST and GraphQL APIs to support front-end applications and integrations.
- Collaborate with the GCP Architect to ensure back-end design aligns with enterprise architecture and security best practices.
- Create integration layers between GCP-hosted services, AlloyDB, Cloud Spanner, Cloud SQL, and third-party APIs.
- Deploy services using Gemini Code Assist, CLI tools, and Git-based CI/CD pipelines.
- Optimize service performance, scalability, and cost efficiency.
- Implement authentication, authorization, and role-based access control using GCP Identity Platform / IAM.
- Utilize AI/ML services (e.g., Vertex AI, Document AI, NLP APIs) to enhance intelligent back-end capabilities.
- Engage in code reviews and uphold clean, maintainable coding standards.

**Qualifications Required:**
- 6-8 years of back-end development experience, with a minimum of 3 years in senior/lead analyst roles.
- Proficiency in one or more back-end programming languages: Node.js, Python, or Java.
- Strong experience with GCP microservices deployments on GKE, App Engine, Cloud Run, and Cloud Functions.
- Deep knowledge of AlloyDB, Cloud Spanner, and Cloud SQL for schema design and query optimization.
- Experience in API development (REST/GraphQL) and integration best practices.
- Familiarity with Gemini Code Assist for code generation and CLI-based deployments.
- Understanding of Git-based CI/CD workflows and DevOps practices.
- Experience integrating AI tools into back-end workflows.
- Strong understanding of cloud security and compliance requirements.
- Excellent communication skills for effective collaboration in a distributed/global team environment.
ACTIVELY HIRING
posted 2 weeks ago
experience5 to 9 Yrs
location
Chennai, All India
skills
  • SAS
  • SQL
  • RDBMS
  • Unix
  • Tableau
  • MS Excel
  • PowerPoint
  • VBA
  • Jira
  • Bitbucket
  • Sharepoint
  • Mainframes
  • Big data
  • Python
  • Data analysis
  • Data profiling
  • MIS reporting
  • Communication skills
  • Interpersonal skills
  • Competency development
  • Training
  • Solution implementation
  • Teradata
  • Banking domain knowledge
  • Risk control metrics
  • Audit framework exposure
  • Process/project management skills
  • Organizational building activities
  • Proactive problem-solving
  • Attention to detail
  • Identifying process gaps
Job Description
As part of the Remediation and Analytics team at AIM, you will be responsible for managing the analysis of customer remediation issues globally, particularly in the retail consumer bank sector. Your main areas of focus will include:
- Remediation analysis: Executing a comprehensive data remediation approach on customer issues stemming from observed gaps in policies, governance, self-identification, or internal audit findings.
- Impact assessment: Identifying the number of affected customers and the financial impact resulting from these issues.
- Issue management & root cause analysis: Utilizing analytical methods to identify underlying issues and their root causes.
- Audit support: Tracking implementation plans and providing the data evidence and artifacts needed for audit completion.

To excel in this role, you will need expertise in the following areas:

Tools and Platforms:
- Proficiency in SAS, SQL, RDBMS, Teradata, Unix, and Tableau
- Proficiency in MS Excel, PowerPoint, and VBA
- Familiarity with Jira, Bitbucket, SharePoint, and mainframes
- Exposure to big data and Python

Domain Skills:
- Good understanding of the banking domain and consumer products such as retail banking, deposits, loans, wealth management, mortgage, and insurance
- Knowledge of finance regulations and the retail business/banking domain preferred

Analytical Skills:
- Ability to identify, articulate, and solve complex business problems and present them in a structured and simplified manner
- Proficiency in data analysis, data profiling, and data management
- Experience in MIS reporting and generating actionable business insights
- Developing automated techniques to improve optimization and reduce redundancy
- Identifying control gaps and providing recommendations based on data strategy
- Exposure to risk & control metrics and audit frameworks preferred

Interpersonal Skills:
- Excellent communication and interpersonal skills
- Strong process/project management abilities
- Collaborative mindset to work effectively across multiple functional areas
- Thrives in dynamic and fast-paced environments
- Proactive approach to problem-solving and attention to detail
- Contribution to organizational initiatives in competency development, training, and team-building activities

Additionally, you should hold a Master's or Advanced Degree in Information Technology, Computer Applications, or Engineering, or an MBA from a premier institute, along with 5-8 years of overall experience and a minimum of 2 years in the banking industry delivering data solutions.
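The data-profiling work this role describes usually starts with a few per-column statistics (null counts, distinct counts) before any remediation or impact assessment. A minimal sketch in plain Python follows; the sample records and column names are invented:

```python
# Minimal data-profiling sketch: per-column null and distinct counts,
# the first checks run when sizing a customer remediation issue.

def profile(records):
    """Return {column: {"nulls": n, "distinct": m}} for a list of dict rows."""
    columns = {key for row in records for key in row}
    report = {}
    for col in sorted(columns):
        values = [row.get(col) for row in records]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return report

records = [
    {"customer_id": 1, "product": "deposit"},
    {"customer_id": 2, "product": None},
    {"customer_id": 2, "product": "mortgage"},
]
report = profile(records)
```

In practice the same counts would be produced in SAS or SQL (`COUNT(*) - COUNT(col)` for nulls, `COUNT(DISTINCT col)` for cardinality) against the warehouse directly.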
ACTIVELY HIRING
posted 2 months ago
experience2 to 6 Yrs
location
Chennai, Tamil Nadu
skills
  • Analytical Skills
  • Software Development
  • System integration
  • Programming expertise
  • Database knowledge
  • Blockchain technology
  • Cryptocurrencies
  • Problem-solving
Job Description
As a Crypto Analyst & Blockchain Software Developer at ZakApps, your day-to-day role will involve analyzing crypto trends, developing blockchain software, and integrating it with databases and other systems.

Key Responsibilities:
- Analyzing crypto trends to make informed decisions
- Developing blockchain software for various applications
- Integrating blockchain solutions with databases and other systems
- Staying updated on the latest developments in blockchain technology and cryptocurrencies
- Collaborating with cross-functional teams on system integration projects

Qualifications Required:
- Strong analytical skills and programming expertise
- Proficiency in software development and database management
- Prior experience in system integration
- Familiarity with blockchain technology and cryptocurrencies
- Excellent problem-solving abilities
- Bachelor's degree in Computer Science or a related field

ZakApps is a specialized solution provider and product developer for the retail, digital media, and advertisement industries. The company is dedicated to creating innovative products and supporting projects from inception to completion. With a focus on delivering high-quality services to global customers, ZakApps leverages top talent for cost-effective delivery.
ACTIVELY HIRING

© 2025 Shine.com | All Rights Reserved
