
474 GCP Jobs in Delhi

posted 1 week ago

DevOps Engineer Cloud & CI/CD

Sharda Consultancy Services
experience: 3 to 6 Yrs
Salary: 10 - 20 LPA
location
Gurugram
skills
  • gcp
  • docker
  • cloud
  • pipelines
  • azure
  • ci
  • aws
  • kubernetes
  • architectures
  • terraform
  • mlops
  • serverless
Job Description
Key Skills & Qualifications
General skills: strong problem-solving and analytical skills, excellent communication and collaboration abilities, and experience working in cross-functional teams.
Preferred qualifications (general): B.Tech / M.Tech / MS / PhD in Computer Science, AI, Data Science, or related fields; 4-10 years of experience in designing and implementing AI systems; experience in startup or fast-paced environments; a strong portfolio or GitHub contributions; excellent communication skills.
Cloud & DevOps: experience with AWS, Azure, or GCP; Infrastructure as Code (Terraform, CloudFormation); CI/CD pipelines and containerization (Docker, Kubernetes); monitoring and scaling AI workloads; certifications in AWS/GCP/Azure; experience with MLOps pipelines; familiarity with serverless architectures.
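For a concrete sense of the monitoring-and-scaling work this posting describes, a minimal illustrative sketch (not part of the listing) using the official Kubernetes Python client to report replica health of deployments on a GKE cluster; the namespace and kubeconfig are assumptions.

```python
# Sketch: report replica health for deployments in one namespace.
# Assumes `pip install kubernetes` and a kubeconfig already pointing at a GKE cluster.
from kubernetes import client, config

def report_replicas(namespace: str = "default") -> None:
    config.load_kube_config()  # use config.load_incluster_config() when running inside a pod
    apps = client.AppsV1Api()
    for dep in apps.list_namespaced_deployment(namespace).items:
        desired = dep.spec.replicas or 0
        ready = dep.status.ready_replicas or 0
        state = "OK" if ready == desired else "DEGRADED"
        print(f"{dep.metadata.name}: {ready}/{desired} ready [{state}]")

if __name__ == "__main__":
    report_replicas()
```

A script like this would typically run as a post-deploy check in a CI/CD pipeline or as a scheduled job rather than ad hoc.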
INTERVIEW ASSURED IN 15 MINS

posted 3 weeks ago

DevOps Engineer Cloud & CI/CD

Sharda Consultancy Services
experience: 3 to 7 Yrs
Salary: 12 - 18 LPA
location
Gurugram
skills
  • cloud computing
  • cd
  • aws
  • ci
  • gcp
  • azure devops
  • docker
  • architectures
  • serverless
  • mlops
Job Description
Required for the DevOps Engineer Cloud & CI/CD role:
- Experience with AWS, Azure, or GCP
- Infrastructure as Code (Terraform, CloudFormation)
- CI/CD pipelines and containerization (Docker, Kubernetes)
- Monitoring and scaling AI workloads
- Certifications in AWS/GCP/Azure
- Experience with MLOps pipelines
- Familiarity with serverless architectures
Shift and process: night shift, 5.5 working days; first round telephonic / Google Meet, followed by a technical round with the tech head (via Google Meet).
INTERVIEW ASSURED IN 15 MINS
posted 2 months ago

GCP Big Data Engineer - Hiring for Reputed Organization

Acme Services Private Limited Hiring For Acme Services
experience: 4 to 9 Yrs
location
Delhi, Bangalore, Pune, Mumbai City

skills
  • python
  • pyspark
  • bigquery
Job Description
Hello folks, we are currently hiring!
Job Title: GCP Data Engineer
Contact: Pooja Patil (poojapatil@acme-services.in)
Job Details:
- Skills: BigQuery, GCP Big Data, Airflow, PySpark/Spark
- Experience: 5+ years
- Locations: Bangalore, Pune, Mumbai, Gurgaon
If you're interested and match the requirements, please share your resume with the email address above.
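As a point of reference for the skills listed (BigQuery, Airflow, PySpark), a small hypothetical Airflow DAG sketch that schedules a daily BigQuery rollup; the project, dataset, and table names are invented for illustration.

```python
# Sketch: daily BigQuery aggregation orchestrated by Airflow (Google provider package).
# Assumes apache-airflow with apache-airflow-providers-google installed and a configured GCP connection.
from datetime import datetime
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

ROLLUP_SQL = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_orders` AS
SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
FROM `example-project.raw.orders`
GROUP BY order_date
"""

with DAG(
    dag_id="daily_orders_rollup",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",  # 02:00 every day
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_orders",
        configuration={"query": {"query": ROLLUP_SQL, "useLegacySql": False}},
    )
```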
INTERVIEW ASSURED IN 15 MINS
posted 2 months ago

GCP Big Data Engineer

Acme Services Private Limited
experience: 6 to 11 Yrs
Salary: 16 - 28 LPA
location
Bangalore, Gurugram, Pune, Mumbai City

skills
  • spark
  • gcp
  • airflow
  • big data
  • scala
  • bigquery
Job Description
We are seeking a seasoned GCP Data Analytics professional with extensive experience in Big Data technologies and Google Cloud Platform services to design and implement scalable data solutions. You will design, develop, and optimize data pipelines using GCP BigQuery, Dataflow, and Apache Airflow to support large-scale data analytics; utilize the Big Data Hadoop ecosystem to manage and process vast datasets efficiently; collaborate with cross-functional teams to gather requirements and deliver reliable data solutions; ensure data quality, consistency, and integrity across multiple data sources; monitor and troubleshoot data workflows to maintain high system availability and performance; and stay updated with emerging trends and best practices in GCP data analytics and big data technologies.
Roles and Responsibilities:
- Implement and manage ETL processes leveraging GCP services such as BigQuery, Dataflow, and Airflow.
- Develop scalable, maintainable, and reusable data pipelines to support business intelligence and analytics needs.
- Optimize SQL queries and data models for performance and cost efficiency in BigQuery.
- Integrate Hadoop ecosystem components with GCP services to enhance data processing capabilities.
- Automate workflow orchestration using Apache Airflow for seamless data operations.
- Collaborate with data engineers, analysts, and stakeholders to ensure alignment of data solutions with organizational goals.
- Participate in code reviews, testing, and deployment activities, adhering to best practices.
- Mentor junior team members and contribute to continuous improvement initiatives within the data engineering team.
Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI-SQL, GCP Dataflow, Big Data Hadoop Ecosystem
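To make the Dataflow/BigQuery portion of this stack concrete, a small hypothetical Apache Beam pipeline (Python SDK) that parses CSV events and appends them to a BigQuery table; the bucket, project, and schema are placeholders, and a production Dataflow run would need the usual GCP options.

```python
# Sketch: batch Beam pipeline; run locally with DirectRunner or on Dataflow with DataflowRunner.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "ts": ts}

options = PipelineOptions(runner="DirectRunner")  # swap to DataflowRunner plus project/region/temp_location on GCP

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```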
INTERVIEW ASSURED IN 15 MINS
posted 2 months ago
experience: 3 to 7 Yrs
location
Noida, Uttar Pradesh
skills
  • SQL
  • Python
  • GCP
  • Glue
  • Airflow
  • DBT
  • Git
  • PySpark
  • Redshift
  • Lambda
  • EC2
  • EKS/Kubernetes
  • CI/CD
Job Description
As a Data Professional at LUMIQ, you will be part of a community that values passion and innovation, providing you the freedom to ideate, commit, and shape your career trajectory at your own pace. Our culture fosters ownership and empowerment, encouraging you to drive outcomes and combine creativity with technology to revolutionize the industry through "Tech Poetry". You will work alongside industry experts, from PhDs and engineers to specialists from Banking, Insurance, NBFCs, and AMCs, who will challenge and inspire you to reach new heights. Key Responsibilities: - Solution Design & Implementation: - Translate complex business requirements into scalable cloud-based data architectures. - Design end-to-end data solutions using AWS services such as Redshift, Glue, Lambda, EC2, EKS/Kubernetes, etc. - Data Pipeline Development & Automation: - Build and maintain automated ETL/ELT pipelines using Python, PySpark, SQL, Airflow, and GCP. - Integrate with DBT for data transformation and modeling. - Data Modeling & Warehousing: - Implement efficient data models for Redshift and other cloud-based warehouses. - Optimize performance, ensure data integrity, and govern data effectively. - Monitoring, RCA & Issue Resolution: - Conduct Root Cause Analysis (RCA) for issues in data warehouses, automations, and reporting systems. - Monitor pipeline health and proactively identify and resolve data anomalies or system failures. - Visualization & Dashboards: - Collaborate with BI and analytics teams to create dashboards and visualizations. - Design data models tailored for efficient visual analytics. - Collaboration & Documentation: - Work cross-functionally with business analysts, product owners, and engineering teams. - Maintain technical documentation for solutions and workflows. Technical Skills Requirements: - Strong proficiency in SQL, Python, and PySpark. - Hands-on experience with AWS services like Redshift, Glue, Lambda, EC2, EKS/Kubernetes. - Experience with Airflow, DBT, and data orchestration frameworks. - Solid understanding of data modeling, warehousing, and data quality practices. - Experience in dashboarding and data visualization support (Power BI/Tableau/Looker optional). - Familiarity with CI/CD and version control systems like Git. Note: The job description does not include any additional details about the company beyond the culture and work environment described above.
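A brief, hypothetical PySpark snippet of the pipeline work described above (paths, columns, and table names are illustrative, not from the posting):

```python
# Sketch: clean raw policy records and write a partitioned, analytics-ready dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("policy_cleanup").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-raw/policies/")  # or a gs:// path on GCP

clean = (
    raw.dropDuplicates(["policy_id"])
       .withColumn("premium", F.col("premium").cast("double"))
       .filter(F.col("premium") > 0)
       .withColumn("issue_month", F.date_format("issue_date", "yyyy-MM"))
)

clean.write.mode("overwrite").partitionBy("issue_month").parquet("s3://example-curated/policies/")
```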
ACTIVELY HIRING
posted 6 days ago
experience: 3 to 7 Yrs
location
Delhi
skills
  • Java
  • RESTful APIs
  • SQL
  • Git
  • AWS
  • Azure
  • GCP
  • Docker
  • Node.js
  • React.js
  • Next.js
  • Microservices architecture
  • NoSQL databases
  • Agile/Scrum methodologies
Job Description
As a Full-Stack Developer, you will be responsible for designing, developing, and maintaining robust, scalable web applications. Your key responsibilities will include: - Developing back-end services and APIs using Java and Node.js - Building responsive and dynamic front-end interfaces using React.js and Next.js - Collaborating with UI/UX designers, product managers, and other developers to deliver high-quality features - Writing clean, maintainable, and efficient code - Integrating third-party APIs and services as needed - Ensuring application performance, quality, and responsiveness - Troubleshooting, debugging, and upgrading existing systems - Following best practices for code versioning, testing, and deployment (Git, CI/CD, etc.) To excel in this role, you should possess the following skills and qualifications: - 3-5 years of professional experience in full-stack development - Strong programming skills in Java (Spring Boot or similar frameworks) - Proficiency in Node.js for server-side development - Hands-on experience with React.js and Next.js for front-end development - Good understanding of RESTful APIs and Microservices architecture - Experience with SQL and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB) - Familiarity with Git and version control workflows - Solid understanding of software development best practices and design patterns - Strong problem-solving and debugging skills - Excellent communication and team collaboration abilities Additionally, the following qualifications would be preferred: - Experience with cloud platforms (AWS, Azure, or GCP) - Knowledge of Docker and containerization tools - Familiarity with Agile/Scrum methodologies - Exposure to performance optimization techniques This job opportunity is available at hirist.tech.,
ACTIVELY HIRING
posted 1 day ago

Data & AI Manager

InTalent Asia
experience: 12 to 16 Yrs
location
Delhi
skills
  • Data Management
  • Analytics
  • Stakeholder Management
  • Strategic Thinking
  • AWS
  • GCP
  • Snowflake
  • Power BI
  • Data Architecture
  • Forecasting
  • Risk Management
  • Predictive Analytics
  • Procurement
  • Finance
  • Supply Chain
  • NLP
  • AI/ML
  • Cloud Platforms Azure
  • Data Stacks Azure
  • Databricks
  • Data Lake/Warehouse Design
  • ERP Integration
  • BI Platforms Integration
  • AIML Models Development
  • Operational Optimization
  • Digital Twins
  • AI-powered Bots
  • Emerging Technologies (Generative AI)
  • RPA
  • Experimentation
  • Agility
  • Continuous Learning
Job Description
As a Data & AI Manager in the fashion/textile and apparel manufacturing & export industry in India, you will play a critical role in leading enterprise-level data and AI transformation initiatives for a prominent apparel manufacturing company. Your passion for building a data-driven organization and implementing AI solutions at scale will be instrumental in driving innovation and delivering measurable business impact through data and AI. **Key Responsibilities:** - Define and implement enterprise data strategy, governance frameworks, and data quality standards. - Oversee data architecture, data lake/warehouse design, and integration with ERP and BI platforms (e.g., SAP S/4HANA, SAC, other data sources). - Lead the development and deployment of AI/ML models for forecasting, risk management, and operational optimization. - Drive initiatives such as digital twins, AI-powered bots, and predictive analytics for procurement, finance, and supply chain. - Manage cross-functional data and AI programs, ensuring alignment with strategic goals and timely delivery. - Collaborate with IT, business improvement, and factory teams across Hong Kong, Bangladesh, and India. - Build and mentor a high-performing team of data scientists, engineers, and analysts. - Act as a trusted advisor to senior leadership, translating business needs into data-driven solutions. - Evaluate emerging technologies (e.g., generative AI, NLP, RPA) and drive their adoption. - Promote a culture of experimentation, agility, and continuous learning. **Qualifications:** - Bachelors or Masters degree in Data Science, Computer Science, Engineering, or related field. - Minimum 12 years of experience in data management, analytics, and AI/ML. - At least 5 years in leadership roles. - Proven track record in delivering enterprise-scale data and AI solutions. - Strong knowledge of modern data stacks (e.g., Azure, Snowflake, Databricks, Power BI). - Excellent communication, stakeholder management, and strategic thinking skills. - Certifications in cloud platforms (Azure, AWS, GCP) and AI/ML frameworks are a plus. If you are ready to be part of a transformation journey where data and AI drive the future, this role offers you the opportunity to make a significant impact in the industry.,
ACTIVELY HIRING
posted 2 months ago
experience: 3 to 7 Yrs
location
Noida, Uttar Pradesh
skills
  • DevOps
  • GCP
  • Google Cloud Platform
  • Cloud Storage
  • Monitoring
  • Troubleshooting
  • Communication Skills
  • CI/CD
  • GitHub Actions
  • Compute Engine
  • Kubernetes Engine
  • Telephony Integration
  • Application Performance
  • Enterprise SRE
Job Description
Role Overview: As a DevOps GCP, your primary responsibility will be to design, implement, and maintain scalable CI/CD pipelines using GitHub Actions and Google Cloud Platform (GCP). You will focus on developing and optimizing infrastructure on GCP, specifically working with Compute Engine, Kubernetes Engine, and Cloud Storage. Collaboration with development teams to streamline software deployment processes will also be a key aspect of your role. Key Responsibilities: - Design, implement, and maintain scalable CI/CD pipelines using GitHub Actions and GCP. - Develop and optimize infrastructure on GCP, including Compute Engine, Kubernetes Engine, and Cloud Storage. - Collaborate with development teams to ensure efficient software deployment processes. - Implement and manage telephony integration for cloud-based communication systems (good to have). - Monitor and troubleshoot infrastructure and application performance issues. - Interface with Enterprise SRE teams for application security and reliability tasks. Qualifications Required: - Ability to handle multiple strategic and critical projects simultaneously. - Effective interpersonal, coaching, team-building, and communication skills. - Experience running/coordinating meetings and major incident calls. - Desire for continuous improvement and ownership skills. - Flexibility to support a 24*7 environment. - Ability to communicate complex technology to non-tech audiences effectively.,
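As an illustration of the CI/CD-plus-Cloud-Storage side of this role, a minimal hypothetical deployment step that pushes a versioned build artifact to a GCS bucket; the bucket and file names are assumptions.

```python
# Sketch: upload a versioned build artifact to Cloud Storage from a CI job.
# Assumes `pip install google-cloud-storage` and application default credentials.
from google.cloud import storage

def upload_artifact(bucket_name: str, local_path: str, version: str) -> str:
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(f"releases/{version}/{local_path}")
    blob.upload_from_filename(local_path)
    return f"gs://{bucket_name}/{blob.name}"

if __name__ == "__main__":
    print(upload_artifact("example-artifacts", "app.tar.gz", "v1.4.2"))
```

In a GitHub Actions workflow this would typically run as a single step after the build job, with credentials provided to the runner.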
ACTIVELY HIRING
posted 3 days ago
experience: 3 to 7 Yrs
location
Noida, Uttar Pradesh
skills
  • Machine Learning
  • Deep Learning
  • Natural Language Processing
  • Transformers
  • Python
  • GCP
  • Azure
  • Linear Algebra
  • Optimization
  • Reinforcement Learning
  • GANs
  • VAEs
  • Diffusion models
  • TensorFlow
  • PyTorch
  • Hugging Face
  • OpenAI's GPT
  • Keras
  • Cloud Platforms AWS
  • Probability Theory
  • Self-supervised Learning
  • AI Ethics
  • Bias Mitigation
Job Description
Role Overview: As a skilled Generative AI Developer, you will be a valuable member of the team contributing to the advancement of cutting-edge generative models. Your primary focus will be on crafting AI systems capable of generating innovative and human-like content spanning various domains such as natural language, images, videos, and music synthesis. Key Responsibilities: - Develop and refine generative models utilizing techniques such as GANs, VAEs, Transformers, and Diffusion models. - Work on tasks involving text generation, image generation, music/sound generation, video creation, and other creative AI applications. - Design, construct, and deploy models using frameworks like TensorFlow, PyTorch, Hugging Face, OpenAI's GPT, or similar tools. - Optimize AI models for performance, accuracy, and scalability in real-world production environments. - Stay updated on the latest trends, research papers, and breakthroughs in the field of generative AI. - Collaborate with cross-functional teams to integrate models into production systems and ensure seamless operations. - Perform data preprocessing, augmentation, and design pipelines to enhance model training input quality. - Document code, processes, and model outputs for team-wide visibility and knowledge sharing. Qualification Required: - Bachelor's or Master's degree in Computer Science or related fields. - 3-4 years of hands-on experience in developing Generative AI solutions. - Strong understanding of deep learning algorithms, particularly in generative models like GANs, VAEs, Diffusion models, or large-scale language models like GPT. - Hands-on experience with machine learning frameworks and libraries such as TensorFlow, PyTorch, or Hugging Face. - Proficient programming skills in Python, including deep learning libraries like TensorFlow, Keras, and PyTorch. - Familiarity with cloud platforms (AWS, GCP, or Azure) for model training and deployment. - Solid mathematical and statistical knowledge, especially in probability theory, linear algebra, and optimization. - Experience in large-scale model training, fine-tuning, or distributed computing. - Knowledge of reinforcement learning and self-supervised learning. - Understanding of AI ethics, bias mitigation, and interpretability in generative models.,
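For a sense of the tooling named above (Hugging Face, PyTorch), a minimal hypothetical text-generation sketch; the model checkpoint and prompt are just examples.

```python
# Sketch: load a small pretrained causal LM and sample text with the transformers pipeline API.
# Assumes `pip install transformers torch`.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # any causal LM checkpoint could be substituted
samples = generator(
    "Generative AI systems can",
    max_new_tokens=40,
    num_return_sequences=2,
    do_sample=True,
    temperature=0.9,
)
for sample in samples:
    print(sample["generated_text"])
```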
ACTIVELY HIRING
posted 2 weeks ago
experience: 7 to 11 Yrs
location
Noida, Uttar Pradesh
skills
  • Data engineering
  • Automated testing
  • Python
  • Java
  • SQL
  • Data modeling
  • GCP technologies
  • Dataproc
  • Cloud Composer
  • Pub/Sub
  • BigQuery
  • CI/CD
  • Infrastructure-as-code
Job Description
Role Overview: As a Data Engineering Lead, you will provide technical leadership and mentorship to a team of data engineers. Your primary responsibility will be to design, architect, and implement highly scalable, resilient, and performant data pipelines. Your expertise in GCP technologies like Dataproc, Cloud Composer, Pub/Sub, and BigQuery will be highly beneficial in this role. Key Responsibilities: - Guide the team in adopting best practices for data engineering, including CI/CD, infrastructure-as-code, and automated testing. - Conduct code reviews, design reviews, and provide constructive feedback to team members. - Develop and maintain robust data pipelines to ingest, process, and transform large volumes of structured and unstructured data. - Implement data quality checks and monitoring systems to ensure data accuracy and integrity. - Design and implement secure and scalable data storage solutions. - Manage and optimize cloud infrastructure costs related to data engineering workloads. - Effectively communicate technical designs and concepts to both technical and non-technical audiences. - Collaborate with cross-functional teams, product managers, and business stakeholders to deliver data solutions that meet their needs. Qualifications Required: - Bachelor's or Master's degree in Computer Science, Engineering, or a related field. - 7+ years of experience in data engineering and Software Development. - 7+ years of experience coding in SQL and Python/Java. - 3+ years of hands-on experience building and managing data pipelines in a cloud environment like GCP. - Strong programming skills in Python or Java, with experience in developing data-intensive applications. - Expertise in SQL and data modeling techniques for both transactional and analytical workloads. - Experience with CI/CD pipelines and automated testing frameworks. - Excellent communication, interpersonal, and problem-solving skills. - Experience leading or mentoring a team of engineers.,
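One concrete flavour of the data-quality responsibility above: a small hypothetical check with the BigQuery client that fails a pipeline run on null or duplicate keys (project, dataset, and thresholds are invented).

```python
# Sketch: fail fast if null keys or duplicate keys appear in a BigQuery table.
# Assumes `pip install google-cloud-bigquery` and application default credentials.
from google.cloud import bigquery

def check_orders_quality(project: str = "example-project") -> None:
    client = bigquery.Client(project=project)
    sql = """
        SELECT
          COUNTIF(order_id IS NULL) AS null_ids,
          COUNT(*) - COUNT(DISTINCT order_id) AS duplicate_ids,
          COUNT(*) AS total_rows
        FROM `example-project.sales.orders`
    """
    row = next(iter(client.query(sql).result()))
    if row.null_ids > 0 or row.duplicate_ids > 0:
        raise ValueError(
            f"Quality check failed: {row.null_ids} null ids, "
            f"{row.duplicate_ids} duplicates out of {row.total_rows} rows"
        )
    print(f"orders table OK: {row.total_rows} rows")

if __name__ == "__main__":
    check_orders_quality()
```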
ACTIVELY HIRING
posted 2 weeks ago
experience: 3 to 7 Yrs
location
All India, Gurugram
skills
  • Data Engineering
  • Google Cloud Platform
  • SQL
  • Python
  • Git
  • Docker
  • Kubernetes
  • BigQuery
  • Dataflow
  • Dataproc
  • Pub/Sub
  • Composer
  • Data Lake
  • Data Warehouse
  • CI/CD
  • Terraform
Job Description
As a Senior Data Engineer GCP at EY FABERNOVEL, you will be responsible for designing, developing, and industrializing data cloud solutions on Google Cloud Platform (GCP) to support the technological and business transformations of clients. You will be involved in the entire project lifecycle including architecture, ingestion, modeling, deployment, and data governance following DataOps and DevOps principles. **Key Responsibilities:** - Design and develop efficient ETL/ELT pipelines on GCP using BigQuery, Dataflow, Dataproc, Pub/Sub, Composer. - Model and maintain scalable and optimized data warehouses and data lakes. - Define and implement reliable and secure cloud data architectures incorporating best DataOps practices. - Collaborate closely with business and technical teams to co-build data-driven solutions. - Ensure data quality, security, and governance in complex cloud environments. - Stay updated on technological advancements and share best practices within the Data & AI community. **Qualifications Required:** - Bachelor's degree in Computer Science or equivalent. - 3 to 5 years of experience in Data Engineering on Google Cloud Platform. - Proficiency in BigQuery, Dataflow, Dataproc, Pub/Sub, Composer, SQL, and Python. - Familiarity with modern data architectures (Data Lake, Data Warehouse, Data Mesh). - Knowledge of CI/CD tools, Git, Docker, Kubernetes, Terraform is a plus. - Professional proficiency in English. Joining EY FABERNOVEL will offer you: - High-impact missions for large clients in France and internationally. - Cutting-edge technological environment (GCP, DataOps, DevOps). - Continuous cloud training and certifications. - Career progression towards architectural or technical leadership roles. - A culture of sharing, diversity, and collective learning. At EY, a job is more than just employment. It is about providing a remarkable experience that will accompany you throughout your career: - Working with top international clients in diverse and multicultural teams. - Access to a comprehensive and personalized training program with a dedicated mentor. - Being part of international competency and sectoral communities comprising top professionals in Europe. - Contributing to a zero-carbon consultancy, with EY prioritizing CSR in its strategy. - Opportunities for geographical mobility and exposure to different cultures through the Mobility4U program. - Engaging in internal networks on ecology, diversity, and inclusion themes, and becoming a patron through the foundation to support impactful causes. - Participation in various EY events (Entrepreneur of the Year Award, Woman In, T-DAY, Sustainable Futures, etc.). EY is committed to diversity, equity, and inclusion, considering all applications equally, including those from individuals with disabilities.
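To illustrate the ingestion side of the stack named above (Pub/Sub feeding Dataflow/BigQuery), a minimal hypothetical publisher; the project, topic, and payload are placeholders.

```python
# Sketch: publish a JSON event to a Pub/Sub topic for downstream Dataflow/BigQuery processing.
# Assumes `pip install google-cloud-pubsub` and application default credentials.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "orders-events")

event = {"order_id": "A-1001", "amount": 129.50, "status": "CREATED"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"), source="checkout")
print("published message id:", future.result())  # blocks until the broker acknowledges
```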
ACTIVELY HIRING
posted 1 day ago
experience: 4 to 8 Yrs
location
Noida, Uttar Pradesh
skills
  • Java
  • Bootstrap
  • Javascript
  • HTML
  • CSS
  • AJAX
  • JSON
  • MySQL
  • SQL Server
  • PostgreSQL
  • Windows
  • Mac
  • Bower
  • NPM
  • Grunt
  • JIRA
  • GIT
  • GITHUB
  • BITBUCKET
  • OAuth
  • MongoDB
  • AWS
  • Azure
  • microservices
  • GCP
  • RESTful APIs
  • Node.js
  • React.js
  • JS Frameworks/Libraries
  • Java Spring Boot
  • Next.js
  • HTML/HTML5
  • CSS/SCSS
  • Tailwind CSS
  • VScode
  • Gulp
  • Angular-CLI
  • Context API
  • Zustand
  • Redux
  • SCSS
  • JWT
Job Description
As a skilled Full Stack Developer with over 4 years of experience in Java, Node.js, and React.js, you will be responsible for developing scalable backend services and designing user-friendly frontend applications for cutting-edge web applications. Your role will involve collaborating with cross-functional teams to deliver high-quality software solutions. **Key Responsibilities:** - Backend development using Node.js and Java, including developing and maintaining RESTful APIs, designing database schemas, implementing authentication and authorization, ensuring high performance and responsiveness of backend services, and working with microservices architecture and cloud deployment. - Frontend development using React.js, involving building dynamic and responsive UI components, implementing state management, optimizing performance and user experience, working with RESTful APIs, and ensuring cross-browser compatibility and mobile responsiveness. - Participating in code reviews, debugging, troubleshooting, and collaborating with UI/UX designers, backend developers, and DevOps teams. - Following Agile methodologies for project execution and writing clean, maintainable, and well-documented code. **Qualifications Required:** - Knowledge of JS Frameworks/Libraries such as React.js, Java Spring Boot, Node.js Next.js, and Bootstrap. - Proficiency in languages like Javascript, Java, HTML, and CSS. - Familiarity with web technologies like HTML/HTML5, CSS/SCSS, Tailwind CSS, AJAX, and JSON. - Experience with databases such as MySQL, SQL Server, or PostgreSQL. - Operating systems like Windows and Mac, and other tools like VScode, Bower, NPM, Gulp, Grunt, Angular-CLI, and JIRA. - Well-versed in version controlling tools like GIT, GITHUB, and BITBUCKET. Your skills in Bower, React.js, GIT, AJAX, Tailwind CSS, OAuth, NPM, MongoDB, Grunt, Java, Context API, CSS, Spring Boot, HTML, JIRA, Bitbucket, AWS, Angular-CLI, Zustand, JSON, PostgreSQL, Azure, Redux, MySQL, microservices, SCSS, GITHUB, JWT, Node.js, Gulp, VScode, RESTful APIs, and GCP will be essential for excelling in this role.,
ACTIVELY HIRING
posted 3 weeks ago
experience: 5 to 9 Yrs
location
Noida, Uttar Pradesh
skills
  • Data security
  • Operational efficiency
  • Data integration
  • Data Platform Engineering
  • Cloud-based data infrastructure
  • Infrastructure as Code
  • Data storage solutions
  • Data processing frameworks
  • Data orchestration tools
Job Description
Role Overview: As a Data Platform Engineer at Capgemini, you will specialize in designing, building, and maintaining cloud-based data infrastructure and platforms for data-intensive applications and services. Your role will involve developing Infrastructure as Code, managing foundational systems and tools for efficient data storage, processing, and management. You will be responsible for architecting robust and scalable cloud data infrastructure, selecting and implementing suitable storage solutions, data processing frameworks, and data orchestration tools. Additionally, you will ensure the continuous evolution of the data platform to meet changing data needs, leverage technological advancements, and maintain high levels of data security, availability, and performance. Your tasks will include creating and managing processes and tools to enhance operational efficiency, optimizing data flow, and ensuring seamless data integration to enable developers to build, deploy, and operate data-centric applications efficiently. Key Responsibilities: - Actively participate in the professional data platform engineering community, share insights, and stay up-to-date with the latest trends and best practices. - Make substantial contributions to client delivery, particularly in the design, construction, and maintenance of cloud-based data platforms and infrastructure. - Demonstrate a sound understanding of data platform engineering principles and knowledge in areas such as cloud data storage solutions (e.g., AWS S3, Azure Data Lake), data processing frameworks (e.g., Apache Spark), and data orchestration tools. - Take ownership of independent tasks, display initiative and problem-solving skills when confronted with intricate data platform engineering challenges. - Commence leadership roles, which may encompass mentoring junior engineers, leading smaller project teams, or taking the lead on specific aspects of data platform projects. Qualifications Required: - Strong grasp of the principles and practices associated with data platform engineering, particularly within cloud environments. - Proficiency in specific technical areas related to cloud-based data infrastructure, automation, and scalability. Company Details: Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With a responsible and diverse group of 340,000 team members in more than 50 countries, Capgemini has a strong over 55-year heritage. Trusted by its clients to unlock the value of technology, Capgemini delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by market-leading capabilities in AI, generative AI, cloud and data, combined with deep industry expertise and partner ecosystem.,
ACTIVELY HIRING
posted 4 days ago
experience: 7 to 11 Yrs
location
Noida, Uttar Pradesh
skills
  • Security
  • Networking
  • Automation
  • Migration Planning
  • Hybrid Cloud
  • Performance Optimization
  • Cost Optimization
  • Incident Management
  • Troubleshooting
  • Ansible
  • AWS
  • Azure
  • Docker
  • Kubernetes
  • Encryption
  • Identity Management
  • Python
  • Bash
  • Go
  • GCP Architect
  • Cloud Architecture
  • Infrastructure as Code
  • CI/CD
  • DevOps Integration
  • Stakeholder Collaboration
  • Terraform
  • Microservices Architecture
  • VPC Design
  • Cloud Interconnect
  • Security Best Practices
  • Anthos
  • Istio
  • Service Mesh
Job Description
Role Overview: At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Key Responsibilities: - Cloud Strategy & Architecture: Design and implement scalable, secure, and cost-effective GCP architectures for a multi-cloud environment. - Migration Planning: Develop and execute migration strategies for applications, data, and infrastructure from on-premise or other cloud platforms (AWS/Azure) to GCP. - Infrastructure as Code (IaC): Utilize Terraform, Ansible, or other IaC tools for automated provisioning and management of cloud resources. - Security & Compliance: Ensure cloud environments adhere to industry security best practices, compliance standards (e.g., ISO, SOC, HIPAA), and Google Cloud security frameworks. - CI/CD & DevOps Integration: Work with DevOps teams to integrate CI/CD pipelines using Azure DevOps, GitHub Actions, or Jenkins for cloud deployments. - Networking & Hybrid Cloud: Design and implement hybrid and multi-cloud networking solutions, including VPNs, interconnects, and service mesh (Anthos, Istio). - Performance & Cost Optimization: Monitor, optimize, and provide recommendations for cloud resource utilization, cost efficiency, and performance enhancements. - Stakeholder Collaboration: Work closely with business, security, and engineering teams to align cloud solutions with organizational goals. - Incident Management & Troubleshooting: Provide technical leadership for incident resolution, root cause analysis, and continuous improvement in cloud operations. Qualifications Required: - 7-11 years of experience. - Strong hands-on experience with GCP services (Compute Engine, GKE, Cloud Functions, BigQuery, IAM, Cloud Armor, etc.). - Familiarity with AWS and/or Azure services and cross-cloud integrations. - Proficiency in Terraform, Ansible, or other IaC tools. - Experience with containerization (Docker, Kubernetes) and microservices architecture. - Strong networking skills, including VPC design, Cloud Interconnect, and hybrid cloud solutions. - Understanding of security best practices, encryption, and identity management in a multi-cloud setup. - Experience in Migration from on-prem to GCP or hybrid cloud architectures. - Experience with Anthos, Istio, or service mesh technologies. - Strong scripting skills in Python, Bash, or Go for automation. Company Details: EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.,
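The scripting requirement in this posting usually translates to small inventory and automation utilities; a hypothetical sketch listing Compute Engine instances in one zone (project and zone are assumptions).

```python
# Sketch: inventory Compute Engine instances in a zone with the google-cloud-compute client.
# Assumes `pip install google-cloud-compute` and application default credentials.
from google.cloud import compute_v1

def list_instances(project: str = "example-project", zone: str = "asia-south1-a") -> None:
    client = compute_v1.InstancesClient()
    for instance in client.list(project=project, zone=zone):
        machine = instance.machine_type.rsplit("/", 1)[-1]  # machine_type is returned as a full URL
        print(f"{instance.name}\t{machine}\t{instance.status}")

if __name__ == "__main__":
    list_instances()
```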
ACTIVELY HIRING
posted 1 week ago

JAVA Developer ( GCP )

Golden Opportunities
experience: 6 to 10 Yrs
location
Noida, Uttar Pradesh
skills
  • GCP
  • JAVA
  • MICROSERVICES
  • SPRING BOOT
Job Description
Role Overview: As a Java Developer specializing in Google Cloud Platform (GCP), your main responsibility will be to design, develop, and deploy Java / Spring Boot Microservices. You will work on implementing and maintaining RESTful APIs with high scalability and performance. Your role will involve working closely with GCP services such as Compute Engine, App Engine, Cloud Run, Pub/Sub, Cloud Functions, Big Query, Cloud Storage, and GKE (Google Kubernetes Engine). Additionally, you will optimize and troubleshoot microservice performance issues, work with Docker, Kubernetes, and CI/CD pipelines, and collaborate with cross-functional teams including QA, DevOps, and Product teams to deliver end-to-end solutions. It will be crucial for you to ensure best practices in coding, testing, and cloud deployment, participate in code reviews, and contribute to technical improvements. Key Responsibilities: - Design, develop, and deploy Java / Spring Boot Microservices - Implement and maintain RESTful APIs with high scalability and performance - Work with various Google Cloud Platform (GCP) services - Compute Engine, App Engine, Cloud Run - Pub/Sub, Cloud Functions - Big Query, Cloud Storage - GKE (Google Kubernetes Engine) - Optimize and troubleshoot microservice performance issues - Work with Docker, Kubernetes, and CI/CD pipelines - Collaborate with cross-functional teams (QA, DevOps, Product) to deliver end-to-end solutions - Ensure best practices in coding, testing, and cloud deployment - Participate in code reviews and contribute to technical improvements Qualifications Required: - Bachelor Degree in a relevant field - 6-9 years of experience in Java development - Strong skills in GCP, Java, Microservices, and Spring Boot (Note: The additional details of the company were not provided in the Job Description),
ACTIVELY HIRING
posted 2 months ago
experience: 6 to 10 Yrs
location
Noida, Uttar Pradesh
skills
  • google cloud platform
  • python
  • aws
  • design
  • cloud
  • google cloud
Job Description
As a Cloud Data Engineer / Architect with expertise in Python and Google Cloud Platform (GCP), your role will involve designing, developing, and researching cloud-based data applications. You will work on scalable solutions, optimize Python-based applications, and collaborate with cross-functional teams to deliver high-quality results. Your responsibilities will include: - Designing and implementing scalable cloud-based data solutions, focusing on Google Cloud Platform. - Developing and optimizing Python-based applications and services for cloud environments. - Conducting research and building proof-of-concepts for new solutions, tools, and frameworks. - Collaborating with teams to solve complex problems and ensure best practices in performance, security, and scalability. - Supporting architectural discussions, documentation, and reviews. Qualifications required for this role include: - Graduation/Postgraduation in Computer Science or related field from Tier-1 institutes (IIT, IIIT, NIT, or equivalent). - 5-7 years of experience in cloud-based data engineering or solution design roles. Technical skills needed: - Strong expertise in Python programming for cloud-based data applications. - Hands-on experience with Google Cloud Platform (GCP); candidates with AWS/Azure experience willing to work in GCP are welcome. - Understanding of cloud data architecture, storage, and processing. Other essential attributes: - R&D mindset, ability to work independently on innovative solutions and POCs. - Strong analytical and problem-solving skills. - Excellent communication and collaboration skills. - Flexibility, hard-working nature, and willingness to work in a dynamic, fast-paced environment. - Open to a hybrid working model from the Noida office. In this role, you will have the opportunity to work on cutting-edge technologies like Google Cloud, Python, AWS, and design cloud-based solutions. Your contributions will be crucial in ensuring the performance, security, and scalability of the systems you design and implement.,
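For a feel of the proof-of-concept work this role describes, a minimal hypothetical HTTP Cloud Function in Python; the function name and payload fields are made up.

```python
# Sketch: an HTTP-triggered Cloud Function that validates a JSON payload and echoes a summary.
# Assumes the Python functions-framework runtime (`pip install functions-framework` for local testing).
import functions_framework

@functions_framework.http
def ingest_event(request):
    payload = request.get_json(silent=True) or {}
    if "event_type" not in payload:
        return ({"error": "missing event_type"}, 400)
    # A fuller POC would forward the event to Pub/Sub or BigQuery here.
    return ({"received": payload["event_type"], "ok": True}, 200)
```

Locally this can be served with `functions-framework --target ingest_event` before any cloud deployment.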
ACTIVELY HIRING
posted 1 week ago
experience: 5 to 9 Yrs
location
Noida, Uttar Pradesh
skills
  • DevOps
  • Ansible
  • Docker
  • Kubernetes
  • Jenkins
  • GitLab
  • Monitoring
  • Logging
  • Bash
  • Python
  • Ruby
  • Google Cloud Platform (GCP)
  • Terraform
  • CI/CD
  • CircleCI
Job Description
Role Overview: As a GCP DevOps Engineer at Prismberry, you will be an essential part of the team responsible for designing, implementing, and optimizing cloud-based infrastructure and CI/CD pipelines on the Google Cloud Platform. Your role will involve collaborating with cross-functional teams to ensure efficient deployment, monitoring, and management of cloud applications. If you are confident, curious, and determined to solve problems effectively in the cloud, this position offers you the opportunity to work in a collaborative and innovative environment that values clarity, confidence, and getting things done through cloud technology. Key Responsibilities: - Design, implement, and optimize cloud infrastructure on GCP using services such as Compute Engine, Kubernetes Engine, Cloud Functions, and more. - Automate infrastructure provisioning and configuration management using tools like Terraform, Ansible, or Deployment Manager. - Build and optimize CI/CD pipelines for seamless deployment and release management of cloud applications. - Collaborate with development teams to ensure application scalability, reliability, and performance on the cloud platform. - Implement and manage monitoring, logging, and alerting systems to ensure the health and availability of cloud services. - Continuously improve DevOps processes and practices, implementing industry best practices and leveraging GCP native tools and services. - Provide technical guidance and support to teams for efficient utilization of GCP DevOps tools and services. Qualifications Required: - Bachelor's degree in Computer Science, Engineering, or a related field. - Strong experience as a DevOps Engineer, with expertise in Google Cloud Platform (GCP) services. - In-depth knowledge of GCP services, including Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, and related technologies. - Proficiency in infrastructure-as-code tools such as Terraform, Ansible, or Deployment Manager. - Experience with containerization and orchestration using Docker and Kubernetes. - Strong understanding of CI/CD principles and experience with tools like Jenkins, GitLab CI/CD, or CircleCI. - Familiarity with monitoring and logging tools such as Stackdriver, Prometheus, or ELK stack. - Excellent scripting skills in languages like Bash, Python, or Ruby. - Strong problem-solving skills and ability to troubleshoot complex infrastructure and deployment issues. - Passion for DevOps and GCP, thriving in a collaborative and fast-paced environment. - Enjoy solving puzzles, understanding how things work, and constantly learning in a demanding environment. Additionally, Prismberry offers competitive salary and performance-based incentives, opportunities for career growth and advancement, a collaborative and innovative work environment, cutting-edge technology solutions, and a strong commitment to employee development and well-being.,
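For the monitoring and logging responsibility above, a small hypothetical snippet that emits a structured event to Cloud Logging so it can be filtered and alerted on; the logger name and fields are invented.

```python
# Sketch: write a structured deployment event to Cloud Logging.
# Assumes `pip install google-cloud-logging` and application default credentials.
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
logger = client.logger("deployments")

logger.log_struct(
    {
        "event": "rollout_complete",
        "service": "example-api",
        "version": "v2.3.0",
        "ready_replicas": 4,
    },
    severity="INFO",
)
print("logged rollout event")
```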
ACTIVELY HIRING
posted 2 weeks ago

JAVA Developer (GCP)

Golden Opportunities
experience: 6 to 10 Yrs
location
Noida, Uttar Pradesh
skills
  • GCP
  • JAVA
  • MICROSERVICES
  • SPRING BOOT
Job Description
As a Java Developer with experience in Google Cloud Platform (GCP), your role will involve designing, developing, and deploying Java / Spring Boot Microservices. You will be responsible for implementing and maintaining RESTful APIs with high scalability and performance. Additionally, you will work with various GCP services such as Compute Engine, App Engine, Cloud Run, Pub/Sub, Cloud Functions, Big Query, Cloud Storage, and GKE (Google Kubernetes Engine). Your duties will also include optimizing and troubleshooting microservice performance issues, working with Docker, Kubernetes, and CI/CD pipelines, and collaborating with cross-functional teams (QA, DevOps, Product) to deliver end-to-end solutions. It is essential to ensure best practices in coding, testing, and cloud deployment, participate in code reviews, and contribute to technical improvements. Key Responsibilities: - Design, develop, and deploy Java / Spring Boot Microservices - Implement and maintain RESTful APIs with high scalability and performance - Work with various Google Cloud Platform (GCP) services - Optimize and troubleshoot microservice performance issues - Collaborate with cross-functional teams to deliver end-to-end solutions - Ensure best practices in coding, testing, and cloud deployment - Participate in code reviews and contribute to technical improvements Qualifications Required: - Bachelor's Degree in a relevant field - 6-9 years of experience in Java development - Experience with GCP, microservices, and Spring Boot - Strong skills in GCP, Java, microservices, and Spring Boot Please note the additional details about the company or any specific information beyond the job description were not provided in the given text.,
ACTIVELY HIRING
posted 5 days ago
experience: 4 to 8 Yrs
location
Noida, Uttar Pradesh
skills
  • Python
  • DevOps
  • AWS
  • Azure
  • GCP
  • MEAN/MERN/MEVN
  • LAMP/PHP
  • Node.js
  • React
  • Vue.js
  • Mobile Application (Native)
  • AI/ML Services
Job Description
As an Upwork Bidder at our company, your role will involve generating qualified leads and converting opportunities for non-Salesforce technology services across global clients. Your responsibilities will include: - Lead Generation & Bidding: - Identifying relevant projects on Upwork and other freelancing platforms. - Submitting high-quality proposals tailored to client requirements. - Writing convincing cover letters, scope summaries, and project breakdowns. - Maintaining a high response and conversion rate. - Client Communication: - Communicating with potential clients through Upwork chat, video calls, or emails. - Understanding client needs, pain points, timelines, budget, and technical requirements. - Scheduling meetings between clients and internal technical teams. - Requirement Analysis: - Analyzing RFPs and client documents to prepare accurate estimates and project scopes. - Coordinating with developers, designers, and project managers to clarify deliverables. - Business Development & Coordination: - Collaborating with internal teams to prepare proposals, presentations, and demos. - Following up aggressively with clients to close deals. - Maintaining a pipeline of qualified opportunities and tracking progress. - Market Research & Strategy: - Researching trending technologies, client personas, and global IT outsourcing demands. - Suggesting new bidding strategies to increase win rates. - Helping expand service offerings across multiple non-Salesforce tech stacks. Qualifications and Skills Required: - Proven experience as an Upwork Bidder, Business Development Executive, or Pre-Sales Consultant. - Strong understanding of non-Salesforce tech stacks like MEAN/MERN/MEVN, LAMP/PHP, Python, Node.js, React, Vue.js, Mobile Application (Native), DevOps, AWS, Azure, GCP, AI/ML Services. - Excellent written and spoken English communication skills. - Ability to understand technical requirements and draft professional proposals. - Strong negotiation and client-handling skills. - Experience working with international clients (US, UK, EU, Middle East preferred). - Goal-oriented, self-driven, and proactive in closing deals. Preferred Qualifications: - Experience working in IT consulting or a software development agency. - Ability to understand project estimation, timelines, and resource planning. - Ability to create pitch decks, business presentations, and sales collateral. Please note that the company is actively involved in various interview levels and has multiple job openings in different locations.,
ACTIVELY HIRING
posted 1 week ago
experience: 3 to 7 Yrs
location
Noida, Uttar Pradesh
skills
  • PHP
  • AWS
  • GCP
  • Docker
  • Kubernetes
  • ReactJS
  • NodeJS
Job Description
As a Technology Specialist at Republic Media, you will be responsible for leading development and innovation across digital platforms. Your strong background in full-stack development, cloud management (AWS & GCP), and team leadership will be crucial in building scalable and secure web applications. Key Responsibilities: - Mentor developers for end-to-end product and application development. - Conduct code analysis and code reviews for full-stack development (Frontend, Backend, Middleware). - Architect, design, and deploy scalable web applications using ReactJS, NodeJS, and PHP (Laravel). - Manage AWS & GCP infrastructure (EC2, S3, RDS, Cloud Storage, Compute Engine, App Engine). - Oversee CI/CD processes, Docker deployments, and DevOps pipelines using Kubernetes. - Optimize system performance, uptime, and security. - Ensure code quality and project delivery by mentoring developers. Technical Skills: - Frontend: ReactJS, JavaScript, HTML, CSS - Backend: NodeJS, PHP (Laravel) - Database: MongoDB, MySQL, PostgreSQL - Cloud: AWS, GCP - Tools: Docker, Git, JIRA, Nginx, Redis Qualifications: - MCA / B.Tech / M.Tech in Computer Science, IT, or related field Please note that this is a full-time, on-site position located at Republic Media Network in Noida Sector 158. (ref:hirist.tech),
ACTIVELY HIRING