dynamodb-jobs

520 DynamoDB Jobs

posted 1 day ago

Senior AWS Cloud Engineer

Princeton IT America
experience
3 to 7 Yrs
location
Karnataka
skills
  • AWS
  • ECS
  • NLB
  • DynamoDB
  • IAM
  • CI/CD pipelines
  • IaC
  • EC2
  • EKS
  • ALB
  • RDS
  • Aurora
  • S3
  • CloudFront
  • VPC
  • Security Groups
  • SQS
  • SNS
Job Description
As a Senior AWS Cloud Engineer, your role will involve designing, building, and maintaining scalable and secure AWS infrastructure. You will architect VPCs, subnets, NAT gateways, load balancers, and IAM roles; deploy and manage containerized workloads using ECS or EKS; and build CI/CD pipelines using tools like GitHub Actions, CodePipeline, or Jenkins. Automating infrastructure with Terraform or CloudFormation will also be a key part of your responsibilities. You will design architecture diagrams, provide cost estimations, and ensure system reliability, performance, and high availability. Monitoring workloads using CloudWatch, X-Ray, alarms, and dashboards, as well as implementing AWS security best practices, will be crucial. You will also optimize cloud costs through rightsizing, reserved plans, tagging, and lifecycle policies.

Key Responsibilities:
- Design, build, and maintain scalable and secure AWS infrastructure
- Architect VPCs, subnets, NAT gateways, load balancers, and IAM roles
- Deploy and manage containerized workloads using ECS or EKS
- Build and implement CI/CD pipelines (GitHub Actions, CodePipeline, or Jenkins)
- Automate infrastructure using Terraform or CloudFormation
- Design architecture diagrams and cost estimations
- Ensure system reliability, performance, and high availability
- Monitor workloads using CloudWatch, X-Ray, alarms, and dashboards
- Implement AWS security best practices (IAM, MFA, KMS, GuardDuty, Config)
- Optimize cloud costs with rightsizing, reserved plans, tagging, and lifecycle policies

Qualifications Required:
- 3+ years of hands-on AWS experience (not just administration)
- Proven experience setting up AWS infrastructure from scratch
- Experience building CI/CD pipelines
- Experience with IaC (Terraform / CloudFormation)
- Strong understanding of AWS core services: EC2, ECS/EKS, ALB/NLB, RDS/Aurora, DynamoDB, S3, CloudFront, IAM, VPC, Security Groups, SQS/SNS
- Ability to troubleshoot complex cloud issues across compute, networking, and storage
- Startup experience preferred (fast-paced, multi-role environment)

Please note that this is a full-time position located in Bangalore, with a hybrid/onsite work policy as per the company's guidelines.
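Cost optimization for DynamoDB of the kind this role describes comes down to simple arithmetic: one read capacity unit (RCU) covers a strongly consistent read of up to 4 KB per second (an eventually consistent read costs half), and one write capacity unit (WCU) covers a write of up to 1 KB per second. A minimal Python sketch of that estimate (function and parameter names are illustrative, not from the posting):

```python
import math

def estimate_capacity(item_kb: float, reads_per_sec: float,
                      writes_per_sec: float, consistent: bool = True) -> dict:
    """Rough provisioned-capacity estimate for a DynamoDB table.

    One RCU = one strongly consistent read/sec of up to 4 KB
    (eventually consistent reads cost half); one WCU = one
    write/sec of up to 1 KB.
    """
    rcu_per_read = math.ceil(item_kb / 4)        # reads are billed in 4 KB units
    if not consistent:
        rcu_per_read = math.ceil(rcu_per_read / 2)
    wcu_per_write = math.ceil(item_kb / 1)       # writes are billed in 1 KB units
    return {
        "rcu": math.ceil(rcu_per_read * reads_per_sec),
        "wcu": math.ceil(wcu_per_write * writes_per_sec),
    }
```

For example, a 6 KB item read strongly-consistently 10 times a second needs 20 RCUs, and 5 writes a second of the same item need 30 WCUs.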
ACTIVELY HIRING

posted 1 day ago

Lead Software Engineer - .Net

JPMC Candidate Experience page
experience
5 to 9 Yrs
location
Maharashtra, Pune
skills
  • C#
  • AWS
  • DynamoDB
  • Relational Databases
  • MySQL
  • DB2
  • SQL Server
  • Jenkins
  • .NET Core
  • S3
  • SNS
  • SQS
  • Lambda
  • Microservices architecture
  • Oracle DB
  • OOPs concepts
  • Version control management (Git)
  • CI/CD
Job Description
As a Lead Software Engineer at JPMorganChase within the Consumer and Community Banking - Chase Travel, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted market-leading technology products securely and at scale.

Your responsibilities include:
- Executing creative software solutions, design, development, and technical troubleshooting with a focus on innovative problem-solving
- Developing secure high-quality production code, and reviewing and debugging code from team members
- Identifying opportunities to automate remediation of recurring issues for improved operational stability
- Leading evaluation sessions with external vendors, startups, and internal teams to drive outcomes-oriented architectural designs and technology integration
- Leading communities of practice to promote awareness and adoption of new technologies
- Fostering a team culture of diversity, opportunity, inclusion, and respect

The qualifications, capabilities, and skills required for this role are:
- Formal training or certification in software engineering concepts with at least 5 years of applied experience
- Proficiency in C# and .NET Core with strong analytical and logical skills
- Hands-on experience in background tasks with hosted services
- Experience in developing secure, high-performance, and complex APIs
- Familiarity with AWS services like S3, SNS, SQS, Lambda, DynamoDB, and microservices architecture
- Ability to multitask and manage multiple priorities effectively
- Proficiency in at least one relational database like MySQL, Oracle DB, DB2, or SQL Server
- Good understanding of database design and design patterns
- Excellent grasp of OOPs concepts
- Positive attitude and willingness to assist others in problem resolution
- Experience with version control management using Git
- Knowledge of modern development technologies and tools such as CI/CD, Jenkins, etc.
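Services built on the SQS/Lambda/DynamoDB stack listed above typically retry throttled or failed calls with capped exponential backoff plus jitter, so many clients don't retry in lockstep. A minimal Python sketch of the "full jitter" variant (names are illustrative, not from the posting):

```python
import random

def backoff_delays(attempts: int, base: float = 0.1, cap: float = 5.0,
                   rng=None) -> list:
    """Capped exponential backoff with full jitter: each retry waits
    a random amount between 0 and min(cap, base * 2**attempt), which
    spreads out retry storms from many concurrent clients."""
    rng = rng or random.Random()
    return [rng.uniform(0, min(cap, base * 2 ** i)) for i in range(attempts)]
```

Passing a seeded `random.Random` makes the schedule reproducible in tests.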
posted 1 day ago

DotNet

Birlasoft
experience
3 to 7 Yrs
location
Noida, Uttar Pradesh
skills
  • C#
  • MongoDB
  • JIRA
  • ECS
  • DynamoDB
  • Git
  • Confluence
  • ServiceNow
  • NET Core
  • AWS SDK
  • CI/CD pipelines
  • NoSQL databases
  • AWS DynamoDB
  • Microsoft NET Framework
  • AWS services
  • EC2
  • Lambda
  • SNS
  • SQS
  • EventBridge
  • CloudWatch
  • Angular 8
  • Lucid portal
Job Description
Role Overview: We are looking for a skilled .NET with AWS Developer to join the team. The ideal candidate should have a strong background in .NET Core, C#, and AWS services, along with experience in developing and integrating applications using CI/CD pipelines. In this role, you will be involved in the full lifecycle development process, from requirements analysis to deployment and maintenance.

Key Responsibilities:
- Develop and integrate requirements using a CI/CD code pipeline with GitHub
- Participate in the full development lifecycle
- Serve as a technical expert on projects
- Write technical specifications
- Support and maintain software functionality
- Evaluate new technologies
- Analyze and revise code
- Participate in software design meetings
- Consult with end users

Qualifications Required:
- Proficiency in .NET Core, C#, and the AWS SDK
- Experience with NoSQL databases like MongoDB and AWS DynamoDB
- Working with JIRA, Microsoft .NET Framework, and supported programming languages
- Strong understanding of AWS services such as EC2, ECS, Lambda, SNS, SQS, EventBridge, DynamoDB, and CloudWatch
- Experience in backend development using C# and .NET Core
- Version control using Git with Copilot

Additional Details: You will be working with tools and technologies such as GitHub Desktop, Visual Studio Code, Visual Studio IDE (Professional 2022 with GitHub Copilot), Teams, and Outlook for communication. Soft skills required for this role include strong communication skills in English and the ability to complete tasks within the time estimated by the team.
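EventBridge, listed in the required services above, routes events by matching them against JSON patterns in which each field maps to a list of acceptable values. A toy Python matcher for that core rule (flat fields only; real EventBridge patterns also support nesting, prefixes, and numeric ranges — this is a sketch, not the service's implementation):

```python
def matches(pattern: dict, event: dict) -> bool:
    """Return True if every field named in `pattern` appears in `event`
    with a value contained in that field's list of allowed values."""
    return all(event.get(field) in allowed
               for field, allowed in pattern.items())

# Hypothetical rule: fire on order events from one illustrative service.
rule = {"source": ["orders.service"],
        "detail-type": ["OrderPlaced", "OrderCancelled"]}
```

A missing field fails the match, since `event.get(field)` returns `None`, which is not in the allowed list.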

posted 5 days ago
experience
2 to 6 Yrs
location
Haryana
skills
  • Python
  • Django
  • Flask
  • SQL
  • MySQL
  • PostgreSQL
  • MongoDB
  • Elasticsearch
  • DynamoDB
  • Spark
  • Kafka
  • AWS
  • Airflow
  • Docker
  • Snowflake
  • Kubernetes
  • Chef
  • Redshift
  • ELK
  • Grafana
  • Terraform
Job Description
As a Data Engineer at our company, your role will involve building, maintaining, and monitoring sustainable infrastructure, with a focus on developing and maintaining high-volume, low-latency data platforms using the latest open-source frameworks, highly available RESTful services, and back-end cloud-based/open-source systems.

Your responsibilities will include:
- Executing day-to-day operation of the data platform to ensure top-notch performance of data platform systems for customer-facing applications
- Working on features to make the data platform more robust and resilient
- Driving automation of application deployment for production environments
- Identifying and supporting areas for process and efficiency improvement within platform operations
- Taking ownership of your projects and working in small cross-functional teams to drive projects to closure
- Being highly collaborative with a strong belief in continuous improvement
- Demonstrating the ability to stay focused and calm in a fast-paced environment

To be successful in this role, you should have:
- A BE/B.Tech/BS/MEng/MSc in computer science or a related technical field from top-tier colleges, or equivalent practical experience
- 2.5-5 years of experience working on platform infrastructure based on cloud and open-source systems and automation
- Proficiency in programming languages, particularly Python
- Experience with Django/Flask for building APIs
- Knowledge of databases, including SQL querying and database understanding (RDBMS: MySQL, PostgreSQL; NoSQL: MongoDB, Elasticsearch, DynamoDB)
- Familiarity with big data tools and distributed systems such as Spark, Redshift, and Kafka
- Experience with cloud computing, preferably AWS
- Familiarity with orchestration tools like Airflow and Docker

It would be good to have experience with:
- Monitoring/logging frameworks like ELK and Grafana
- Scalable compute solutions like Kafka and Snowflake
- Workflow management libraries and tools
- Container orchestration systems like Kubernetes
- Infrastructure-as-Code/Pipeline-as-Code tools like Terraform and Chef

Additionally, understanding and functional experience in Big Data Engineering practices such as data warehousing, ingestion pipelines, ELT, data migration, and QC would be beneficial for this role.
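The QC step in an ingestion pipeline like the one described above is often a pure function that deduplicates records and rejects any missing required fields before load. A minimal Python sketch (field names are illustrative, not from the posting):

```python
def qc_filter(records, key="id", required=("id", "ts")):
    """Deduplicate on `key` (first occurrence wins) and reject any
    record missing a required field; return (clean, rejected)."""
    seen, clean, rejected = set(), [], []
    for rec in records:
        if any(rec.get(field) is None for field in required):
            rejected.append(rec)          # fails the quality gate
        elif rec[key] not in seen:
            seen.add(rec[key])
            clean.append(rec)             # later duplicates are dropped silently
    return clean, rejected
```

Keeping the gate pure makes it trivial to unit-test independently of Airflow, Spark, or whatever orchestrates it.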
posted 6 days ago

React.js Intern

Codes For Tomorrow
experience
1 to 5 Yrs
location
Indore, Madhya Pradesh
skills
  • JavaScript
  • MongoDB
  • PostgreSQL
  • MySQL
  • DynamoDB
  • Docker
  • Kubernetes
  • Mocha
  • Git
  • GitHub
  • GitLab
  • Bitbucket
  • Node.js
  • Express.js
  • NestJS
  • TypeScript
  • Jest
  • Chai
Job Description
As a Node.js Developer at Codes For Tomorrow (CFT), you will be responsible for developing, testing, and maintaining high-performance backend applications using Node.js.

Your key responsibilities will include:
- Designing and implementing RESTful APIs and integrating third-party services
- Working with databases such as MongoDB, PostgreSQL, or MySQL
- Optimizing applications for speed, scalability, and security
- Collaborating with frontend developers, designers, and product managers to build full-stack solutions
- Implementing authentication and authorization mechanisms (OAuth, JWT, etc.)
- Writing clean, efficient, and maintainable code following best practices
- Conducting code reviews, troubleshooting, and debugging applications
- Working with cloud services like AWS, Azure, or Google Cloud
- Implementing caching, message queues, and background jobs using Redis, RabbitMQ, Kafka, etc.
- Monitoring application performance and improving system reliability
- Staying up to date with the latest industry trends and technologies

Qualifications required for this role include:
- Strong proficiency in JavaScript and Node.js
- Experience with Express.js, NestJS, and TypeScript
- Knowledge of asynchronous programming and event-driven architecture
- Proficiency in working with databases (MongoDB, PostgreSQL, MySQL, or DynamoDB)
- Experience with microservices architecture and containerization (Docker, Kubernetes)
- Familiarity with unit testing and integration testing using Jest, Mocha, or Chai
- Hands-on experience with version control systems (Git, GitHub, GitLab, or Bitbucket)
- Understanding of CI/CD pipelines and deployment automation
- Strong problem-solving and analytical skills
- Good communication skills and the ability to work in a collaborative team environment

Codes For Tomorrow (CFT) is a company that values continuous learning and innovation. The salary for this position is negotiable based on the interview. For the Developer profile, there is a bond period of 1 year. If you are interested, you can reach out to Codes For Tomorrow (CFT) at 0731 4058698 or visit their website at [http://codesfortomorrow.com/](http://codesfortomorrow.com/). The company is located at B/35, Veena Nagar, near Sukhliya MR10, Indore. This is a full-time job opportunity that may require in-person work at the Indore location.
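The JWT-based authentication mentioned above boils down to signing `base64url(header).base64url(payload)` with an HMAC. A stripped-down Python sketch of the HS256 scheme (for illustration only; in production you would use a vetted JWT library, not this):

```python
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    """URL-safe base64 without padding, as used in compact JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    """Produce a compact HS256-style token: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    mac = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256)
    return f"{header}.{body}.{_b64url(mac.digest())}"

def verify(token: str, secret: bytes) -> bool:
    """Constant-time check of the signature over the first two parts."""
    header, body, sig = token.split(".")
    mac = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256)
    return hmac.compare_digest(sig, _b64url(mac.digest()))
```

Note this sketch omits everything a real library handles: `exp`/`nbf` claim checks, algorithm pinning, and rejection of `alg: none`.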
posted 6 days ago

Chapter Lead (AI)

Commonwealth Bank
experience
14 to 18 Yrs
location
Karnataka
skills
  • HTML
  • JavaScript
  • CSS
  • Python
  • DynamoDB
  • Apache Spark
  • Agile
  • Git
  • .NET Core
  • React JS
  • TypeScript
  • AWS SNS
  • SQS
  • Lambda
  • MLOps
  • MLflow
  • Langfuse
  • LlamaIndex
  • Prompt Engineering
  • Retrieval-Augmented Generation (RAG)
  • Redshift
  • Parquet
  • Iceberg
  • Test-Driven Development (TDD)
Job Description
As a Chapter Lead (AI) at CommSec, your role is crucial in building scalable agentic AI solutions that align with business objectives and integrate with existing systems. You will implement automated validation of LLM outputs, define performance metrics for AI outcomes, and apply ethical AI practices using appropriate tools and frameworks. You will also utilize AWS cloud services, work with big data technologies, collaborate with software engineers to deploy AI models in production, and develop monitoring systems to track AI model performance in live environments. Additionally, you will participate in research initiatives to explore and apply emerging AI models and methodologies, analyze systems and applications for enhancements, and contribute to the technical design and architecture of system enhancements. Your role will also involve leading and supporting software development efforts across geographically dispersed teams, assisting in team management and leadership tasks, and following structured release and change management processes for software builds and deployments. Troubleshooting complex deployment and environment-related issues and preparing documentation for deployment and configuration scripts will also be part of your responsibilities.

**Key Responsibilities:**
- Build scalable agentic AI solutions that integrate with existing systems and align with business objectives
- Implement automated validation of LLM outputs and define performance metrics for AI outcomes
- Apply ethical AI practices using appropriate tools and frameworks
- Utilize AWS cloud services including SNS, SQS, and Lambda for AI and system operations
- Work with big data technologies such as Apache Spark and vector databases
- Collaborate with software engineers to deploy AI models in production, ensuring robustness and scalability
- Develop monitoring systems to track AI model performance in live environments
- Participate in research initiatives to explore and apply emerging AI models and methodologies
- Analyze systems and applications, providing recommendations for enhancements and future development
- Contribute to the technical design and architecture of system enhancements
- Lead and support software development efforts across geographically dispersed teams
- Assist in team management and leadership tasks within collaborative environments
- Follow structured release and change management processes for software builds and deployments
- Troubleshoot complex deployment and environment-related issues
- Prepare and document deployment and configuration scripts for development, test, and production environments

**Qualifications Required:**
- 14+ years of hands-on software development experience, with a strong foundation in building scalable web applications and APIs
- Proficient in .NET Core, React JS, TypeScript, HTML, JavaScript, and CSS, with a solid understanding of web architecture and front-end/backend integration
- Skilled in Python for AI/ML development, including traditional machine learning techniques and modern frameworks
- Experience with cloud technologies, especially AWS (SNS, SQS, Lambda), and familiarity with container systems
- Strong grasp of MLOps practices and tools such as MLflow, Langfuse, and LlamaIndex, enabling efficient deployment and monitoring of AI models
- Knowledge of Prompt Engineering, Retrieval-Augmented Generation (RAG), and vector databases (e.g., DynamoDB, Redshift)
- Experience with big data frameworks like Apache Spark, and data formats such as Parquet and Iceberg
- Comfortable working in Agile environments and applying Test-Driven Development (TDD) methodologies
- Familiar with source control systems (e.g., Git) and build systems
- Strong communication skills: able to articulate technical decisions and collaborate effectively with cross-functional teams
- Self-driven and capable of working independently or as part of a team
- Passionate about clean code, best practices, and continuous improvement
- Demonstrates a growth mindset with a willingness to learn new tools and technologies

If you hold a Bachelor's or Master's degree in engineering or information systems from an accredited university and possess the essential skills mentioned above, you are encouraged to apply for this challenging and rewarding role at CommSec in Bangalore-Manyata Tech Park.
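Retrieval-Augmented Generation, listed in the qualifications above, hinges on a vector-similarity lookup: embed the query, rank stored chunks by cosine similarity, and pass the top hits to the model as context. A toy Python retriever over pre-computed vectors (the tiny two-dimensional vectors below are invented purely for illustration; real systems use learned embeddings with hundreds of dimensions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def retrieve(query_vec, corpus, k=2):
    """Return the k chunk texts whose embeddings are most similar to
    the query embedding -- the retrieval half of RAG."""
    ranked = sorted(corpus, key=lambda c: cosine(query_vec, c["vec"]),
                    reverse=True)
    return [c["text"] for c in ranked[:k]]

corpus = [
    {"text": "dynamodb capacity", "vec": [1.0, 0.1]},
    {"text": "lambda cold starts", "vec": [0.0, 1.0]},
    {"text": "dynamodb indexes", "vec": [0.9, 0.2]},
]
```

A production vector database replaces the linear scan with an approximate nearest-neighbor index, but the ranking contract is the same.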
posted 5 days ago
experience
4 to 8 Yrs
location
Haryana
skills
  • Node
  • JavaScript
  • Couchbase
  • API development
  • SQL Server
  • MySQL
  • PostgreSQL
  • Oracle
  • Couchbase
  • Cassandra
  • RabbitMQ
  • Kafka
  • ActiveMQ
  • React
  • TypeScript
  • Microservices Architecture
  • Service-Oriented Architecture
  • Event-Driven Architecture
  • AWS DynamoDB
  • Software as a Service SaaS
  • RESTful Web Services
  • AWS DynamoDB
  • SQS
Job Description
Role Overview: As a technical leader, your primary responsibility will be to mentor a team of engineers in developing Internet-scale applications with a focus on performance, reliability, and scalability. You will collaborate closely with Product and Design teams to create customer-centric solutions. Your role will involve creating intuitive and interactive web applications using Node, React, and TypeScript/JavaScript, and implementing Microservices Architecture, Service-Oriented Architecture (SOA), and Event-Driven Architecture (EDA) in real-life applications. Working with database technologies, including traditional relational databases and NoSQL products like AWS DynamoDB and Couchbase, will be part of your daily tasks. You will have the opportunity to work with top engineers on complex Software as a Service (SaaS) applications.

Key Responsibilities:
- Provide technical leadership and mentorship to a team of engineers
- Collaborate with Product and Design teams to develop customer solutions
- Create intuitive and interactive web applications using Node, React, and TypeScript/JavaScript
- Implement Microservices Architecture, SOA, and EDA in applications
- Work with various database technologies, including relational and NoSQL databases
- Contribute to the architecture and design of software applications
- Lead engineering teams in technical discussions and vision building
- Work on full-stack development, from front-end user interfaces to backend systems

Qualifications Required:
- Bachelor's degree (or higher) in Computer Science or a related technical discipline
- Strong understanding of Computer Science fundamentals
- Excellent verbal and written communication skills
- Strong analytical skills and ability to handle complex software development
- 4 to 7 years of software development experience
- Hands-on programming experience with Node, JavaScript, and TypeScript
- Experience with RESTful Web Services and API development
- Knowledge of Design Patterns, Non-Functional Requirements (NFRs), and databases
- Experience with queuing technologies such as SQS, RabbitMQ, Kafka, or ActiveMQ
- Experience leading an engineering team
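Event-Driven Architecture of the kind this role calls for starts from a simple contract: producers publish events by topic, and the bus fans each event out to every subscriber of that topic. An in-memory Python sketch of that contract (real systems would put SQS, Kafka, or RabbitMQ behind the same interface):

```python
from collections import defaultdict

class EventBus:
    """Minimal synchronous pub/sub: publish fans an event out to all
    handlers subscribed to its topic, in subscription order."""

    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._handlers[topic]:
            handler(event)
```

The synchronous loop is the toy part; swapping it for a broker adds delivery guarantees and retries without changing producers or subscribers.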
posted 2 days ago
experience
2 to 6 Yrs
location
Maharashtra, Pune
skills
  • JavaScript
  • Go
  • Event handling
  • Azure
  • Communication skills
  • Mentoring
  • Automated testing
  • DynamoDB
  • Redis
  • Memcached
  • Monitoring tools
  • DOM manipulation
  • Web APIs
  • Content Delivery Networks (CDNs)
  • Cloud platforms (AWS)
  • NoSQL databases
  • Cross-browser compatibility testing
  • CDN edge functions and workers
  • ScyllaDB
  • Testing frameworks
  • Web accessibility standards
  • Agile development practices
  • Kubernetes deployments
  • CICD pipelines
  • Observability tools
  • Performance analysis tools
  • Technical architecture discussions
Job Description
As a Software Engineer at Arkose Labs, you will play a crucial role in the Client Integrations team. Your primary responsibility will be the development of key systems used internally and externally. You will focus on frontend development of client-side JavaScript to ensure it is performant and scalable across various browsers and devices. Additionally, you will have the opportunity to work on backend services written in Go, supporting client-side code with robust APIs.

**Role Overview:**
In this role, you will design, test, and implement features for client-side JavaScript embedded into millions of requests daily. You will collaborate with cross-functional teams to build innovative features to combat fraud and bots. Your work will involve developing cutting-edge fingerprinting and device identification methods and taking ownership of challenging projects.

**Key Responsibilities:**
- Design, test, and implement features for client-side JavaScript
- Collaborate with cross-functional teams to combat fraud and bots
- Develop cutting-edge fingerprinting and device identification methods
- Take ownership of challenging projects
- Maintain system health by reviewing code, writing tests, and reducing technical debt
- Contribute to technical discussions and quarterly planning
- Share learnings through documentation and knowledge sharing

**Qualifications Required:**
- 2-4 years of relevant software engineering experience
- Proficiency in JavaScript and backend languages (Go)
- Experience with DOM manipulation, event handling, and Web APIs
- Proficiency in modern frontend frameworks and libraries
- Experience with CDNs, cloud platforms (AWS, Azure), and NoSQL databases
- Cross-browser compatibility testing experience
- Strong communication skills and mentoring experience
- Understanding of code quality impact and automated testing
- Comfortable participating in an equitable team on-call roster

Arkose Labs is a technology-driven company dedicated to protecting enterprises from cybercrime and abuse. With a global customer base that includes Fortune 500 companies, Arkose Labs offers a collaborative ecosystem where you can actively partner with influential brands to tackle technical challenges and safeguard millions of users worldwide. Join Arkose Labs to work with cutting-edge technology, foster innovation, and collaborate with experienced leadership. As part of a diverse and high-performing environment, you will have the opportunity to contribute your unique perspectives and experiences to drive global change.

Benefits of working at Arkose Labs include a competitive salary, equity opportunities, a beautiful office space with perks, a robust benefits package, provident fund, accident insurance, flexible working hours, and support for personal well-being and mental health. At Arkose Labs, we value people, teamwork, customer focus, execution, and security in all aspects of our work.
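Device identification of the sort described above typically canonicalizes a set of observed browser attributes and hashes them into a stable identifier, so the same attribute set always yields the same fingerprint regardless of collection order. A simplified Python sketch (the attribute names are illustrative; production fingerprinting is far more involved than this):

```python
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    """Hash a canonical (sorted-key, compact) JSON encoding of the
    observed attributes so key order never changes the identifier."""
    canonical = json.dumps(attrs, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]
```

Canonicalization is the important step: without `sort_keys=True`, two browsers reporting identical attributes in different order would hash to different identifiers.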
posted 4 days ago

Azure

Virtusa
experience
5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • IaaS
  • Storage
  • Networking
  • ARM
  • MySQL
  • DynamoDB
  • TCP/IP
  • DNS
  • SMTP
  • HTTP
  • Git
  • Jenkins
  • Chef
  • Puppet
  • Docker
  • Kubernetes
  • monitoring
  • provisioning
  • Azure platform Services
  • Security principles
  • Azure AD
  • AKS
  • Containers
  • Azure Security
  • Terraform
  • MSSQL
  • NoSQL DBs
  • cloud architecture patterns
  • DevOps solutions
  • Terraform
  • AKS
  • Engaging with client development
Job Description
As an experienced Azure Cloud Architect, you will be responsible for architecting and delivering solutions leveraging various platform services within the Azure platform. Your role will involve understanding the capabilities and limitations of Azure platform services and ensuring that security principles and measures are in place.

Key Responsibilities:
- Direct experience with a wide range of services from the Microsoft Azure Cloud Platform, including infrastructure- and security-related services such as Azure AD, IaaS, AKS, Containers, Storage, Networking, and Azure Security
- Shape enterprise solutions and develop Microsoft Azure Cloud architecture, with excellent documentation skills
- Set up, deploy, and manage multiple environments to support agile development approaches
- Hands-on experience in migrating large-scale complex enterprise applications to Azure Cloud and designing operations processes
- Design and implement high availability and disaster recovery solutions with ARM and Terraform
- Knowledge of MSSQL, MySQL, and NoSQL DBs such as DynamoDB
- Solid understanding of networking and core internet protocols, cloud architecture patterns, and solution design principles
- Implement DevOps solutions integrating various tools based on client/business requirements
- Configure monitoring for different attributes and handle scale-up and scale-down scenarios for applications in Azure
- Provide best practices for provisioning production and non-production environments on Azure to optimize usage
- Engage with client development, infrastructure, security, network, and IT operations teams as part of cloud solutioning

Qualifications Required:
- Strong, in-depth, and demonstrable hands-on experience with Microsoft Azure and its relevant build, deployment, automation, networking, and security technologies in cloud and hybrid environments
- Possession of either the Developing Microsoft Azure Solutions or the Architecting Microsoft Azure certification

This job requires a deep understanding of Azure platform services and security principles, along with experience in enterprise solution shaping and cloud architecture development. Hands-on experience in migrating enterprise applications to Azure Cloud and designing operations processes is essential, and possessing the Developing Microsoft Azure Solutions or Architecting Microsoft Azure certification will be beneficial.
posted 2 days ago
experience
3 to 7 Yrs
location
Karnataka
skills
  • Consulting
  • Snowflake
  • AWS
  • Python
  • Spark
  • Glue
  • EMR
  • DynamoDB
  • JSON
  • Avro
  • ORC
  • Industry experience in Retail/CPG/Media
  • Client-facing experience
  • Certifications AWS
  • Databricks
  • Data Engineering
  • Data Modeling skills
  • Experience in Extract, Transform, Load (ETL) processes
  • Data Warehousing
  • Data Analytics skills
  • Proficiency in relevant programming languages like SQL
  • Python
  • Experience with cloud services like AWS
  • Databricks
  • Strong analytical and problem-solving skills
  • Programming: Python
  • AWS Services: S3, Lambda, Step Functions
  • Databricks: Delta Lake, MLflow, Unity Catalog experience
  • Databases: SQL (PostgreSQL, MySQL), NoSQL (MongoDB)
  • Data Formats
Job Description
You are applying for the role of Senior Data Engineer at Beige Bananas, a rapidly growing AI consulting firm specializing in creating custom AI products for Fortune 500 Retail, CPG, and Media companies, with an outcome-driven mindset to accelerate clients' value realization from their analytics investments.

**Role Overview:**
As a Senior Data Engineer at Beige Bananas, you will be responsible for data engineering, data modeling, ETL processes, data warehousing, and data analytics. You will work independently to build end-to-end pipelines in AWS or Databricks.

**Key Responsibilities:**
- Design and implement scalable data architectures
- Build and maintain real-time and batch processing pipelines
- Optimize data pipeline performance and costs
- Ensure data quality, governance, and security
- Collaborate with ML teams on feature stores and model pipelines

**Qualifications Required:**
- Data engineering and data modeling skills
- Experience in Extract, Transform, Load (ETL) processes
- Data warehousing and data analytics skills
- Proficiency in programming languages like SQL and Python
- Experience with cloud services like AWS and Databricks
- Strong analytical and problem-solving skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field

**Additional Details of the Company:**
Beige Bananas is a pure-play AI consulting firm that focuses on creating hyper-custom AI products for Fortune 500 Retail, CPG, and Media companies. They have a fast-paced environment with a focus on accelerating clients' value realization from their analytics investments. If you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, apply today and be a part of Beige Bananas' exciting journey!
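Batch pipelines on the S3/Step Functions stack listed above commonly lay data out under Hive-style date-partitioned prefixes (`dt=YYYY-MM-DD/`) so that engines like Spark and Athena can prune partitions at query time. A small Python helper sketching that convention (the dataset and file names are made up for illustration):

```python
from datetime import date

def partition_key(dataset: str, day: date, filename: str) -> str:
    """Build a Hive-style date-partitioned S3 object key, the layout
    query engines rely on for partition pruning."""
    return f"{dataset}/dt={day.isoformat()}/{filename}"
```

Writing every file under its `dt=` prefix means a query filtered to one day touches only that day's objects instead of scanning the whole dataset.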
posted 2 days ago
experience
7 to 11 Yrs
location
Karnataka
skills
  • leadership skills
  • MongoDB
  • ECS
  • API Gateway
  • DynamoDB
  • microservices
  • database design
  • containerization
  • Docker
  • Kubernetes
  • analytical skills
  • communication skills
  • MERN stack
  • AWS cloud services
  • AI/ML-based solutions
  • Express.js
  • React.js
  • Node.js
  • AWS services
  • Lambda
  • S3
  • RDS
  • CloudFront
  • serverless architectures
  • RESTGraphQL APIs
  • DevOps practices
  • CI/CD pipelines
  • Problem-solving
  • AI/ML concepts
  • AI frameworks
  • TensorFlow
  • PyTorch
  • security best practices
  • compliance standards
Job Description
Role Overview: Enphase Energy, a global energy technology company specializing in solar, battery, and electric vehicle charging products, is seeking an experienced Software Architect to join the team. Your role will involve designing and delivering scalable, high-performance applications, guiding a development team, aligning technology strategy with business objectives, and driving successful project delivery. This position requires hands-on expertise in the MERN stack, AWS cloud services, and strong leadership skills.

Key Responsibilities:
- Define, design, and implement scalable, secure, and high-availability architectures across web, backend, and cloud environments
- Provide guidance to the engineering team, ensuring adherence to best practices, coding standards, and architectural principles
- Architect and integrate AWS services (e.g., Lambda, ECS, S3, API Gateway, DynamoDB, RDS, CloudFront) to build and optimize distributed systems
- Oversee the design and implementation of applications using MongoDB, Express.js, React.js, and Node.js (MERN stack)
- Optimize application performance and scalability to handle growing user and data demands
- Work closely with product managers, stakeholders, and cross-functional teams to translate business requirements into technical solutions
- Lead, coach, and motivate a team of developers, fostering a culture of learning, innovation, and accountability
- Ensure timely delivery of high-quality solutions by enforcing code reviews, automated testing, and CI/CD practices

Qualifications Required:
- 7+ years of professional experience in software development, with at least 3 years in an architectural or leadership role
- Strong expertise in the MERN stack and modern JavaScript/TypeScript development
- Hands-on experience designing and deploying AWS-based architectures
- Solid understanding of microservices, serverless architectures, and REST/GraphQL APIs
- Proficiency in database design and management (SQL and NoSQL)
- Experience with DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes)
- Strong problem-solving, analytical, and communication skills
- Proven experience managing development teams and delivering complex projects successfully

Additional Company Details: Enphase Energy, founded in 2006, revolutionized solar power with its innovative microinverter technology. Today, the Enphase Energy System enables users to make, use, save, and sell their own power. With over 80 million products shipped to 160 countries, Enphase is recognized as one of the most successful and innovative clean energy companies globally. Join Enphase's dynamic teams to design and develop next-gen energy technologies and contribute to a sustainable future. This role requires working onsite 3 days a week, with plans to transition back to a full 5-day in-office schedule over time.
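The serverless pieces of this stack (API Gateway fronting Lambda) follow a fixed proxy-integration contract: the handler must return a dict with `statusCode` and a string `body`. A minimal sketch of that contract — shown in Python for illustration even though the role centers on Node.js; the handler and field values here are illustrative assumptions:

```python
import json

def build_response(status_code, body, headers=None):
    """Shape a payload into the API Gateway Lambda proxy-integration format."""
    return {
        "statusCode": status_code,
        "headers": {"Content-Type": "application/json", **(headers or {})},
        "body": json.dumps(body),  # API Gateway expects the body as a string
    }

def handler(event, context):
    """Illustrative handler: echo a path parameter back to the caller."""
    name = (event.get("pathParameters") or {}).get("name", "world")
    return build_response(200, {"message": f"hello {name}"})

print(handler({"pathParameters": {"name": "architect"}}, None)["statusCode"])  # 200
```

The same shape applies in Node.js; only the language changes, not the contract.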
ACTIVELY HIRING
posted 4 days ago
experience3 to 7 Yrs
location
Hyderabad, Telangana
skills
  • DynamoDB
  • PostgreSQL
  • MySQL
  • AWS
  • ECS
  • API Gateway
  • Redis
  • Nodejs
  • ExpressJS
  • NestJS
  • Fastify
  • TypeScript
  • DocumentDB
  • SNS
  • SQS
  • Kinesis
  • Lambda
  • EC2
  • RDS
  • GitHub Actions
Job Description
As a Backend Developer at DAZN Hyderabad, you will design and develop backend services using Node.js with TypeScript, building scalable APIs and microservices that meet high availability and performance standards.

Key Responsibilities:
- Design and develop backend services using Node.js with TypeScript
- Build scalable APIs and microservices aligned with high availability and performance
- Work with cross-functional teams including frontend, DevOps, and QA
- Write clean, maintainable code with solid unit test coverage
- Ensure code quality through code reviews, CI/CD pipelines, and logging

Qualifications Required:
- Strong proficiency in Node.js and TypeScript
- Hands-on experience designing RESTful APIs and microservices
- Exposure to AWS cloud services
- Familiarity with Agile development and DevOps practices
- Excellent problem-solving skills and an ownership mindset

Please note that company details were not provided in the job description.
ACTIVELY HIRING
posted 1 day ago
experience5 to 9 Yrs
location
Chennai, Tamil Nadu
skills
  • analytics
  • data processing
  • DynamoDB
  • Postgres
  • Athena
  • data quality
  • access control
  • compliance
  • validation
  • logging
  • Python
  • Spark
  • OAUTH
  • Tableau
  • AWS S3
  • Delta Tables
  • schema design
  • partitioning
  • workflow automation
  • governance practices
  • data pipelines
  • Delta Lake framework
  • AWS Step functions
  • Event Bridge
  • AppFlow
Job Description
You have extensive experience in analytics and large-scale data processing across diverse data platforms and tools. Your responsibilities will include:
- Managing data storage and transformation across AWS S3, DynamoDB, Postgres, and Delta Tables with efficient schema design and partitioning
- Developing scalable analytics solutions using Athena and automating workflows with proper monitoring and error handling
- Ensuring data quality, access control, and compliance through robust validation, logging, and governance practices
- Designing and maintaining data pipelines using Python, Spark, the Delta Lake framework, AWS Step Functions, EventBridge, AppFlow, and OAuth

Qualifications Required:
- Extensive experience in analytics and large-scale data processing
- Proficiency with data platforms such as AWS S3, DynamoDB, Postgres, and Delta Tables
- Strong skills in schema design, partitioning, and data transformation
- Experience developing scalable analytics solutions using tools like Athena
- Proficiency in programming languages such as Python and Spark

Additional Details: The tech stack you will be working with includes S3, Postgres, DynamoDB, Tableau, Python, and Spark.
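Efficient partitioning on S3 usually means Hive-style `key=value` prefixes, which let Athena and Spark prune partitions instead of scanning the whole table. A minimal sketch of a path builder under that assumption (the bucket and table names are hypothetical):

```python
from datetime import date

def partition_prefix(bucket: str, table: str, day: date) -> str:
    """Build a Hive-style S3 prefix (year=/month=/day=) so query engines
    can skip partitions that fall outside a date filter."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/"
    )

# Where a daily batch for 2024-03-07 would land
print(partition_prefix("analytics-lake", "events", date(2024, 3, 7)))
# s3://analytics-lake/events/year=2024/month=03/day=07/
```

Zero-padding the month and day keeps prefixes lexicographically sortable, which matters for range listings.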
ACTIVELY HIRING
posted 2 days ago

Lead Software Engineer - .Net + AWS

Chase- Candidate Experience page
experience5 to 9 Yrs
location
Maharashtra, Pune
skills
  • C
  • AWS
  • DynamoDB
  • Relational Databases
  • MySQL
  • DB2
  • SQL Server
  • Jenkins
  • net core
  • S3
  • SNS
  • SQS
  • Lambda
  • Microservices architecture
  • Oracle DB
  • OOPs concepts
  • Version Control management GIT
  • CICD
Job Description
Role Overview: As a Lead Software Engineer at JPMorganChase within Consumer and Community Banking - Chase Travel, you will be an integral part of an agile team working on enhancing, building, and delivering trusted market-leading technology products in a secure, stable, and scalable manner. As a core technical contributor, you will drive critical technology solutions across multiple technical areas to support the firm's business objectives, thinking creatively and executing software solutions beyond routine approaches to solve technical problems effectively.

Key Responsibilities:
- Execute creative software solutions, design, development, and technical troubleshooting
- Develop secure, high-quality production code and review/debug code written by others
- Identify opportunities to automate remediation of recurring issues to improve operational stability
- Lead evaluation sessions with external vendors, startups, and internal teams to drive outcome-oriented probing of architectural designs
- Lead communities of practice across Software Engineering to drive awareness and use of new technologies
- Contribute to a team culture of diversity, opportunity, inclusion, and respect

Qualifications Required:
- Formal training or certification in software engineering concepts with 5+ years of applied experience
- Hands-on programming experience and strong analytical and logical skills in C# and .NET Core
- Hands-on experience with background tasks using hosted services
- Experience developing secure, high-performance, highly available, and complex APIs
- Experience with AWS services such as S3, SNS, SQS, Lambda, and DynamoDB, and with microservices architecture
- Ability to multitask and manage multiple priorities and commitments concurrently
- Proficiency with at least one relational database such as MySQL, Oracle DB, DB2, or SQL Server
- Good understanding of DB design and design patterns for building elegant and extensible systems
- Excellent understanding of OOP concepts
- Positive attitude and willingness to help others resolve problems and issues
- Experience with version control management using Git
- Knowledge of modern development technologies and tools such as CI/CD and Jenkins
ACTIVELY HIRING
posted 1 day ago

Software Engineer III - .Net Fullstack + AWS

Chase- Candidate Experience page
experience3 to 7 Yrs
location
Maharashtra, Pune
skills
  • C
  • Angular
  • AWS
  • DynamoDB
  • Microservices
  • MySQL
  • DB2
  • SQL Server
  • OOPs
  • NoSQL
  • Elasticsearch
  • Cassandra
  • Kafka
  • NET Core
  • S3
  • SNS
  • SQS
  • Lambda
  • Oracle DB
Job Description
You have an exciting opportunity to advance your software engineering career as a Software Engineer III at JPMorganChase, specifically within the Consumer and Community Banking division. In this role, you will be a key member of an agile team responsible for designing and delivering trusted market-leading technology products in a secure, stable, and scalable manner to support the firm's business objectives.

**Key Responsibilities:**
- Execute standard software solutions, design, development, and technical troubleshooting
- Write secure and high-quality code using the syntax of at least one programming language with limited guidance
- Design, develop, code, and troubleshoot with consideration of upstream and downstream systems and technical implications
- Apply knowledge of tools within the Software Development Life Cycle toolchain to improve automation value
- Apply technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gather, analyze, and draw conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
- Learn and apply system processes, methodologies, and skills for the development of secure, stable code and systems
- Contribute to a team culture of diversity, opportunity, inclusion, and respect

**Qualifications Required:**
- Formal training or certification in software engineering concepts and 3+ years of applied experience
- Hands-on programming experience and strong analytical and logical skills in C# and .NET Core
- Hands-on experience with Angular, including its latest version
- Hands-on experience with background tasks using hosted services
- Experience developing secure, high-performance, highly available, and complex APIs
- Experience with AWS services such as S3, SNS, SQS, Lambda, and DynamoDB
- Experience with microservices architecture
- Proficiency with at least one relational database such as MySQL, Oracle DB, DB2, or SQL Server
- Good understanding of DB design and design patterns for building elegant and extensible systems
- Excellent understanding of OOP concepts

**Additional Company Details:**
- Not available in the provided job description.
ACTIVELY HIRING
posted 6 days ago
experience5 to 9 Yrs
location
Haryana
skills
  • Python
  • Java
  • Scala
  • AWS
  • Glue
  • EMR
  • Spark
  • DynamoDB
  • IAM
  • KMS
  • Snowflake
  • Agile
  • Spark
  • AppSync
  • Lambda
  • Step Functions
  • Event Bridge
  • CloudFormation
  • S3
  • RDS
  • SM
  • AWS Redshift
  • Azure Synapse Analytics
  • ElastiCache
  • AWS Solutions Architect
  • AWS Big Data Certification
Job Description
As an experienced professional in designing, implementing, and testing Python applications, you will be responsible for the following key tasks:
- Design, implement, and test Python applications with a strong understanding of object-oriented programming, multi-tier architecture, and parallel/multi-threaded programming
- Program primarily in Python; familiarity with Java/Scala is considered a plus
- Utilize AWS technologies such as AppSync, Lambda, Step Functions, and EventBridge
- Demonstrate solid experience with AWS services including CloudFormation, S3, Glue, EMR/Spark, RDS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
- Apply knowledge of modern cloud-native data warehouses such as AWS Redshift, Snowflake, or Azure Synapse Analytics
- Implement metadata solutions using AWS non-relational data stores such as ElastiCache and DynamoDB
- Desired: experience working with Python and Spark, along with Agile project experience
- Preferred certifications: AWS Solutions Architect or AWS Big Data certification
- Thrive in a high-energy, fast-paced, entrepreneurial environment with a willingness to learn new skills and technologies
- Possess strong organizational, written, and verbal communication skills, meticulous attention to detail, and the ability to work independently
- Collaborate effectively as a team player, building strong relationships across all levels of the technology and business organization

Feel free to reach out for any further clarifications or details.
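Step Functions workflows like those above are defined in Amazon States Language, a JSON document listing states and transitions. A sketch of composing a two-step definition as a Python dict — the state names and Lambda ARNs are hypothetical:

```python
import json

def task_state(resource_arn, next_state=None):
    """Build an ASL Task state; the final state sets End instead of Next."""
    state = {"Type": "Task", "Resource": resource_arn}
    if next_state is None:
        state["End"] = True
    else:
        state["Next"] = next_state
    return state

definition = {
    "Comment": "Extract then load (illustrative pipeline)",
    "StartAt": "Extract",
    "States": {
        "Extract": task_state(
            "arn:aws:lambda:us-east-1:123456789012:function:extract", "Load"
        ),
        "Load": task_state(
            "arn:aws:lambda:us-east-1:123456789012:function:load"
        ),
    },
}

print(json.dumps(definition, indent=2))
```

Building the definition programmatically keeps state names in one place, which helps when the same workflow is stamped out by CloudFormation or CDK.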
ACTIVELY HIRING
posted 5 days ago
experience5 to 9 Yrs
location
Haryana
skills
  • Python
  • Java
  • Scala
  • AWS
  • Glue
  • EMR
  • Spark
  • DynamoDB
  • IAM
  • KMS
  • Snowflake
  • Agile
  • Spark
  • AppSync
  • Lambda
  • Step Functions
  • Event Bridge
  • CloudFormation
  • S3
  • RDS
  • SM
  • AWS Redshift
  • Azure Synapse Analytics
  • ElastiCache
  • AWS Solutions Architect
  • AWS Big Data Certification
Job Description
As an experienced Python developer, you will design, implement, and test Python applications with a solid understanding of object-oriented programming, multi-tier architecture, and parallel/multi-threaded programming. Programming expertise in Python is crucial, while knowledge of Java/Scala is advantageous. Your experience with AWS technologies, including AppSync, Lambda, Step Functions, and EventBridge, will be essential for this role.

Key Responsibilities:
- Implement AWS services such as CloudFormation, S3, Glue, EMR/Spark, RDS, DynamoDB, Lambda, Step Functions, IAM, KMS, SM, etc.
- Utilize modern cloud-native data warehouses such as AWS Redshift, Snowflake, or Azure Synapse Analytics
- Develop metadata solutions using AWS non-relational data stores such as ElastiCache and DynamoDB
- Collaborate on Agile projects and work with Python and Spark

Qualifications Required:
- AWS Solutions Architect or AWS Big Data certification is preferred
- Ability to adapt and thrive in a high-energy, high-growth, fast-paced, entrepreneurial environment
- Strong organizational, written, and verbal communication skills with great attention to detail
- Highly collaborative team player who can establish strong relationships across all levels of the technology and business organization

If you possess the required skills and are willing to learn new technologies while contributing effectively to a dynamic team environment, this position is ideal for you. Join us in our mission to create innovative solutions using Python and AWS technologies.
ACTIVELY HIRING
posted 2 days ago
experience4 to 8 Yrs
location
Tamil Nadu, Coimbatore
skills
  • Cloud Computing
  • DynamoDB
  • Triggers
  • Dashboards
  • Alarms
  • DMS
  • Subnets
  • AWS Services
  • S3
  • CloudFront
  • CloudWatch
  • Lambda Functions
  • Cognito
  • EC2 Instances
  • SQS
  • Step Functions
  • Kinesis
  • Load Balancer
  • Auto Scaling groups
  • Cloud watch logs
  • CloudWatch Metrics
  • Schedulers
  • SNS
  • VPC
  • VPC Endpoint
  • IAM Roles
  • IAM Policies
  • IAM Identity Providers
  • Event Bridge Rules
Job Description
Role Overview: You will work as an AWS Cloud Engineer, responsible for designing and implementing cloud solutions using various AWS services. As an experienced professional with over 4 years of experience, you will work remotely from home on a full-time schedule with weekends off. Your working hours will be from 2:00 pm to 10:00 pm.

Key Responsibilities:
- Design and implement cloud solutions using AWS services such as S3, CloudFront, CloudWatch, Lambda functions, Cognito, EC2 instances, SQS, Step Functions, Kinesis, DynamoDB, triggers, Load Balancers, Auto Scaling groups, dashboards, CloudWatch Logs, CloudWatch Metrics, schedulers, SNS, alarms, DMS, VPCs, subnets, VPC endpoints, IAM roles, IAM policies, IAM identity providers, and EventBridge rules
- Ensure the security and scalability of cloud infrastructure
- Monitor and optimize cloud performance and cost
- Collaborate with cross-functional teams to deliver high-quality solutions

Qualifications Required:
- Any UG / PG / Computer Engineering graduates
- Proficiency in cloud computing with hands-on experience in AWS services
- Strong knowledge of cloud security best practices
- AWS certifications will be a plus

Please note that the job offers benefits such as health insurance, internet reimbursement, leave encashment, paid sick time, paid time off, and the flexibility to work from home.
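The S3-to-Lambda trigger pattern listed above delivers a `Records` envelope, and the handler's first job is pulling the bucket and object key out of each record. A locally testable sketch with a hand-built sample event (the bucket and key values are made up):

```python
import urllib.parse

def handler(event, context):
    """Extract (bucket, key) pairs from an S3 object-created notification.
    Object keys arrive URL-encoded, so decode them before use."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        objects.append((s3["bucket"]["name"], key))
    return objects

# Simulate the notification shape AWS sends for an upload
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"},
                "object": {"key": "reports/q1+2024.csv"}}}
    ]
}
print(handler(sample_event, None))  # [('uploads', 'reports/q1 2024.csv')]
```

The same parsing applies whether the trigger is wired directly or routed through SQS/SNS; only the envelope nesting changes.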
ACTIVELY HIRING
posted 1 day ago
experience6 to 10 Yrs
location
Karnataka
skills
  • AWS
  • Data integration
  • Data transformation
  • Data architecture
  • Data modeling
  • Data warehousing
  • Scripting
  • DynamoDB
  • Glue
  • Athena
  • IAM
  • Data migration
  • SDLC
  • SQL development
  • Python
  • Databricks
  • PySpark
  • S3
  • ETL processes
  • Cloudbased analytics solutions
  • Data processing pipelines
  • Data Platform
  • Cloud ecosystems
  • AWS Native services
  • MSK
  • CloudWatch
  • Lambda
  • ETL development
  • Analytics applications development
  • Unit test cases
  • PyTest
Job Description
As an AWS Data Engineer at Quest Global, you will be responsible for designing, developing, and maintaining data pipelines while ensuring data quality and integrity within the MedTech industry.

Key Responsibilities:
- Design scalable data solutions on the AWS cloud platform
- Develop data pipelines using Databricks and PySpark
- Collaborate with cross-functional teams to understand data requirements
- Optimize data workflows for improved performance
- Ensure data quality through validation and testing processes

To be successful in this role, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 6 years of experience as a Data Engineer with expertise in AWS, Databricks, PySpark, and S3. You should have a strong understanding of data architecture, data modeling, and data warehousing concepts, as well as experience with ETL processes, data integration, and data transformation. Excellent problem-solving skills and the ability to work in a fast-paced environment are also essential.

Required Skills and Experience:
- Experience implementing cloud-based analytics solutions in Databricks (AWS) and S3
- Scripting experience building data processing pipelines with PySpark
- Knowledge of Data Platform and Cloud (AWS) ecosystems
- Working experience with AWS native services such as DynamoDB, Glue, MSK, S3, Athena, CloudWatch, Lambda, and IAM
- Expertise in ETL development, analytics application development, and data migration
- Exposure to all stages of the SDLC
- Strong SQL development skills
- Proficiency in Python and PySpark development
- Experience writing unit test cases using PyTest or similar tools

If you are a talented AWS Data Engineer looking to make a significant impact in the MedTech industry, we invite you to apply for this exciting opportunity at Quest Global.
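In practice, the "unit test cases using PyTest" requirement usually means keeping transformations as pure functions that can be tested without a Spark session. A sketch under that assumption (the cleaning rules here are invented for illustration):

```python
def normalize_record(record: dict) -> dict:
    """Trim the id and coerce the amount to float; a pure function
    like this is trivially testable outside Spark."""
    return {
        "id": str(record["id"]).strip(),
        "amount": float(record.get("amount") or 0.0),
    }

# PyTest discovers functions named test_*; a bare assert is the idiom
def test_normalize_record():
    assert normalize_record({"id": " a1 ", "amount": "3.5"}) == {
        "id": "a1",
        "amount": 3.5,
    }

def test_missing_amount_defaults_to_zero():
    assert normalize_record({"id": "x", "amount": None})["amount"] == 0.0

test_normalize_record()
test_missing_amount_defaults_to_zero()
```

The same function can then be applied inside a Spark job (e.g. via a mapped UDF), so the tested logic and the pipeline logic stay identical.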
ACTIVELY HIRING
posted 5 days ago

AWS Backend Engineer

Ubique Systems
experience3 to 7 Yrs
location
Maharashtra, Pune
skills
  • PostgreSQL
  • MySQL
  • SQL
  • AWS
  • NoSQL
  • DynamoDB
  • AWS Aurora
  • Lambda
  • Nodejs
  • CICD
  • GitBitbucket
Job Description
As an AWS Backend Engineer in Pune, your responsibilities will include:
- Strong knowledge of database internals, with a specific focus on PostgreSQL and MySQL
- Expertise in data modeling and a deep understanding of cloud database engines, with a preference for AWS Aurora
- Experience optimizing DB queries to improve efficiency
- Knowledge of AWS cloud services, particularly Lambda functions
- Working knowledge of Lambda and API functions in Node.js
- Understanding of agile methodologies and practices such as CI/CD, application resiliency, and security
- Familiarity with Git/Bitbucket version control
- Experience with NoSQL databases, specifically DynamoDB

You will be expected to leverage your technical skills to contribute to the efficient operation and development of backend systems on AWS.
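DynamoDB data modeling typically hinges on composing partition and sort keys so one table serves several access patterns. A pure-Python sketch of that convention — the `USER#`/`ORDER#` prefixes are an illustrative assumption, not a library API:

```python
def order_item(user_id: str, order_id: str, total: float) -> dict:
    """Compose single-table keys: all of a user's orders share one
    partition key, and the ORDER# sort-key prefix lets a Query with
    begins_with fetch only order rows for that user."""
    return {
        "PK": f"USER#{user_id}",
        "SK": f"ORDER#{order_id}",
        "total": total,
    }

item = order_item("u42", "o1001", 99.5)
print(item["PK"], item["SK"])  # USER#u42 ORDER#o1001
```

The dict maps directly to a `put_item` call once real table access is wired in; keeping key composition in one helper avoids inconsistent prefixes across the codebase.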
ACTIVELY HIRING

© 2025 Shine.com | All Rights Reserved
