45 Bigtable Jobs in Arcot

posted 2 months ago
experience: 6 to 10 Yrs
location: Hyderabad, Telangana
skills
  • Java
  • OOAD
  • JEE
  • Spring
  • Spring Boot
  • Hibernate
  • Oracle
  • PostgreSQL
  • Bigtable
  • NoSQL
  • Git
  • IntelliJ IDEA
  • Dataflow
  • Python
  • Jira
  • JSON
  • XML
  • YAML
  • Ruby
  • Perl
  • C
  • C++
  • Docker
  • Kubernetes
  • Spring Cloud
  • Cloud SQL
  • BigQuery
  • Pub/Sub
  • Agile development methodology
  • Multitenant cloud technologies
  • Terraform
  • Node.js
  • Bash scripting
Job Description
Role Overview: You will be part of the team working on a significant software project for a world-class company that provides M2M/IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Your role will involve contributing to the development of end-user module firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and analyzing and estimating customer requirements.

Key Responsibilities:
- Implementing new features and delivering production-ready code
- Creating technical documentation and system diagrams
- Troubleshooting and resolving reported bugs
- Optimizing performance of the software

Qualifications Required:
- Minimum 6 years of experience in developing and designing software applications using Java
- Strong understanding of core computer science fundamentals, including data structures, algorithms, and concurrent programming
- Proficiency in OOAD and design principles, implementing microservices architecture using various technologies
- Experience working in native and hybrid cloud environments and with Agile development methodology
- Excellent collaboration and communication skills to work effectively across product and technology teams
- Ability to translate strategic priorities into scalable and user-centric solutions
- Detail-oriented problem solver with strong analytical skills
- Proficiency in Java, Java IDEs, Java EE application servers, Git, Maven, scripting languages, JSON, XML, YAML, and Terraform
- Preferred skills include Agile Scrum methodologies, continuous integration systems like Jenkins or GitHub CI, SAFe methodologies, security solutions design, and development experience in languages and technologies such as Spring, Spring Boot, C, C++, Oracle, Docker, and Kubernetes

About the Company: GlobalLogic is a trusted digital engineering partner to leading companies worldwide, offering opportunities for continuous learning and development, meaningful work, balance, and flexibility. The company values a culture of caring, learning and growth opportunities, interesting and impactful projects, balance and flexibility, and integrity and trust.

posted 3 weeks ago

GCP Solution Architect

InOpTra Digital
experience: 15 to 19 Yrs
location: Karnataka
skills
  • Application Development
  • DevOps
  • Security
  • Networking
  • Monitoring
  • GCP Core Services
  • Data Analytics
Job Description
As a GCP Solution Architect with 15+ years of experience in cloud infrastructure, data solutions, application development, modernization, and DevOps practices, your role involves a strategic approach to architecting complex cloud solutions and driving business value for clients.

**Solution Architecture & Design**
- Design and architect end-to-end GCP cloud solutions encompassing infrastructure, data platforms, and application architecture.
- Lead application modernization initiatives, including migration strategies to GCP and containerization.
- Develop cloud-native architectures using GCP services like GKE, Cloud Run, App Engine, Cloud Functions, and Anthos.
- Create technical roadmaps and architecture blueprints in alignment with business objectives.
- Ensure solutions adhere to best practices for security, scalability, reliability, and cost optimization.

**Pre-Sales & Client Engagement**
- Lead technical discussions in pre-sales engagements and client meetings.
- Conduct discovery sessions to understand client requirements and pain points.
- Develop compelling technical proposals, solution presentations, and proof-of-concepts (POCs).
- Provide effort estimation and solution sizing for proposed architectures.
- Address technical objections and offer clarifications during sales cycles.

**SOW Development & Documentation**
- Write detailed Statements of Work (SOW) outlining scope, deliverables, timelines, and assumptions.
- Define project phases, milestones, and success criteria.
- Document technical architecture, design decisions, and implementation approaches.
- Create high-level and low-level design documents and architecture decision records (ADRs).

**Data & Analytics**
- Design data architectures using BigQuery, Cloud SQL, Cloud Spanner, Bigtable, and Firestore.
- Architect data lakes, data warehouses, and data pipelines on GCP.
- Implement ETL/ELT solutions with Dataflow, Dataproc, Cloud Composer, and Pub/Sub.
- Design analytics and business intelligence solutions, along with data governance and security frameworks.

**DevOps & Automation**
- Design and implement CI/CD pipelines using Cloud Build, Jenkins, GitLab CI, or similar tools.
- Architect infrastructure-as-code solutions with Terraform, Deployment Manager, or Pulumi.
- Implement monitoring and observability solutions using Cloud Monitoring, Cloud Logging, and Cloud Trace.
- Design containerization and orchestration strategies using Docker and Kubernetes (GKE).

**Leadership & Collaboration**
- Mentor junior architects and technical teams, collaborating across functions.
- Provide technical leadership through project lifecycles, staying current with GCP innovations and industry trends.
- Participate in architecture review boards and governance discussions.

Your qualifications should include 15+ years of IT experience with a focus on cloud technologies, hands-on experience with GCP, expertise in pre-sales activities, SOW creation, and technical documentation, as well as experience in enterprise cloud migrations and transformations. Preferred qualifications include certifications such as Google Cloud Professional Cloud Architect, Data Engineer, DevOps Engineer, and Security Engineer, along with soft skills like excellent communication, stakeholder management, analytical thinking, and problem-solving abilities. Experience with multi-cloud architecture, application modernization patterns, data science, ML/AI workloads, and compliance frameworks is advantageous.
posted 2 months ago
experience: 2 to 6 Yrs
location: All India
skills
  • Python
  • JS
  • Angular
  • Java
  • C#
  • MySQL
  • Elastic Search
  • Elasticsearch
  • Kafka
  • Apache Spark
  • Logstash
  • Hadoop
  • Hive
  • Kibana
  • Athena
  • Presto
  • Bigtable
  • AWS
  • GCP
  • Azure
  • Unit Testing
  • Continuous Integration
  • Agile Methodology
  • React
  • TensorFlow
  • Deployment Practices
Job Description
As a passionate Software Engineer with a proven track record of solving complex problems and driving innovation, working at ReliaQuest will offer you the opportunity to write groundbreaking code and manipulate data to automate threat detection and response for one of the world's fastest-growing industries. You will play a pivotal role in creating, testing, and deploying cutting-edge security technology for enterprise customers globally. This role will not only allow you to collaborate with top talent in the industry but also make a direct contribution to the growth and success of RQ.

Key Responsibilities:
- Research and develop innovative solutions using cutting-edge technologies to enhance our platform, GreyMatter.
- Develop REST APIs and integrations to enhance and automate threat detection for customers.
- Manage complex technology deployment processes through continuous integration.
- Conduct code reviews to ensure ongoing improvement.
- Proactively automate and enhance the software development lifecycle.
- Collaborate closely with different parts of the business to ensure seamless product utilization.
- Provide support to team members and foster a culture of collaboration.

Qualifications Required:
- 2-4 years of software development experience in Python, JS, React, Angular, Java, C#, MySQL, Elasticsearch, or equivalent technologies.
- Proficiency in written and verbal English.

What makes you stand out:
- Hands-on experience with technologies such as Elasticsearch, Kafka, Apache Spark, Logstash, Hadoop/Hive, TensorFlow, Kibana, Athena/Presto/Bigtable, Angular, and React.
- Familiarity with cloud platforms like AWS, GCP, or Azure.
- Strong understanding of unit testing, continuous integration, and deployment.
- Experience with Agile methodology.
- Higher education or relevant certifications.
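As a rough illustration of the REST API work this listing describes, here is a minimal Python sketch using FastAPI. The /alerts route, the Alert model, and the in-memory store are all hypothetical stand-ins; ReliaQuest's actual GreyMatter APIs are not public.

```python
# Minimal sketch of a threat-detection-style REST endpoint.
# FastAPI, the /alerts route, and the in-memory ALERTS dict are
# illustrative placeholders, not ReliaQuest's actual stack.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Alert(BaseModel):
    id: int
    severity: str
    description: str

# In-memory stand-in for a real detection store.
ALERTS = {1: Alert(id=1, severity="high", description="Suspicious login")}

@app.get("/alerts/{alert_id}")
def get_alert(alert_id: int) -> Alert:
    if alert_id not in ALERTS:
        raise HTTPException(status_code=404, detail="Alert not found")
    return ALERTS[alert_id]
```

Served with any ASGI runner (for example `uvicorn module:app`), this returns the alert as JSON and a 404 for unknown IDs.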

posted 1 month ago

React Developer

Arcot Group
experience: 2 to 6 Yrs
location: Maharashtra
skills
  • JavaScript
  • RESTful APIs
  • NPM
  • React.js
  • ES6+ syntax
  • Redux
  • Context API
  • asynchronous programming
  • Webpack
  • Babel
Job Description
As a React Developer at Arcot Group, you will be responsible for building user-friendly and dynamic web applications using React.js. You will work closely with designers and backend developers to create high-performance applications that deliver exceptional user experiences.

Key Responsibilities:
- Developing and implementing user interface components using React.js.
- Translating designs and wireframes into high-quality code.
- Building reusable components and front-end libraries for future use.
- Optimizing components for maximum performance across various web-capable devices and browsers.
- Collaborating with backend developers to integrate APIs and improve application functionality.
- Participating in code reviews and ensuring adherence to best practices.
- Keeping abreast of the latest industry trends and technologies.

Qualifications Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a React Developer or in a similar role.
- Strong proficiency in JavaScript, including ES6+ syntax and features.
- Experience with state management libraries (e.g., Redux, Context API).
- Familiarity with RESTful APIs and asynchronous programming.
- Understanding of front-end development tools such as Webpack, Babel, and NPM.
- Excellent problem-solving skills and the ability to work collaboratively in a team environment.
posted 2 months ago
experience: 5 to 9 Yrs
location: All India
skills
  • Python
  • Java
  • C++
  • System integration
  • Software engineering
  • Data structures
  • Algorithms
  • Software design patterns
  • Google Technologies
  • Protocol Buffers
  • Malware analysis
  • Reverse engineering
  • API design
  • GoogleSQL
  • Android security ecosystem
  • Data processing pipelines
Job Description
Role Overview: You will be responsible for implementing backend integrations for in-house and open-source tools, resolving bugs, and adding new features to existing tools. Your role will also involve full-stack development work, requiring you to work on both frontend and backend tasks.

Key Responsibilities:
- Implement backend integrations for in-house and open-source tools into the analysis pipeline
- Resolve bugs and add new features to existing tools
- Work on both frontend and backend development tasks
- Design APIs and integrate systems
- Utilize strong analytical and problem-solving skills to address complex technical challenges

Qualifications Required:
- Strong proficiency in general-purpose programming languages such as Python, Java, or C++
- Experience in backend development, including API design and system integration
- Solid understanding of software engineering principles, data structures, algorithms, and software design patterns
- Familiarity with Google technologies like Boq, Borg, Bigtable, Blobstore, Spanner, Goops (Pub/Sub), and Flume
- Proficiency in Protocol Buffers for data serialization and communication
- Experience with GoogleSQL for data querying and analysis
- Knowledge of malware analysis, reverse engineering concepts, and the Android security ecosystem
- Experience in building and maintaining large-scale data processing pipelines
- Experience in developing tools for security professionals or technical audiences

Additional Company Details: The company operates in the IT Services and IT Consulting sector, with a focus on providing software development services. The company's website is http://www.virtusa.com. The company offers a platform called TALENTMATE Portal, aimed at facilitating the hiring process for professionals in the industry.
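For context on the Bigtable integration work this listing mentions, here is a minimal read/write sketch using the public google-cloud-bigtable Python client. The project, instance, table, row key, and "cf1" column family are hypothetical placeholders, and the column family is assumed to already exist.

```python
# Minimal Bigtable read/write sketch; "my-project", "my-instance",
# "my-table", and the "cf1" column family are placeholder names.
from google.cloud import bigtable

client = bigtable.Client(project="my-project")
table = client.instance("my-instance").table("my-table")

# Write one cell to a row.
row = table.direct_row(b"sample#001")
row.set_cell("cf1", b"status", b"analyzed")
row.commit()

# Read it back; cells are keyed by family, then column qualifier.
data = table.read_row(b"sample#001")
print(data.cells["cf1"][b"status"][0].value)  # b'analyzed'
```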
posted 3 days ago
experience: 3 to 15 Yrs
location: Maharashtra
skills
  • Python
  • JS
  • Angular
  • Java
  • C#
  • MySQL
  • Elastic Search
  • Elasticsearch
  • Kafka
  • Apache Spark
  • Logstash
  • Hadoop
  • Hive
  • Kibana
  • Athena
  • Presto
  • Bigtable
  • AWS
  • GCP
  • Azure
  • unit testing
  • continuous integration
  • Agile Methodology
  • React
  • TensorFlow
Job Description
Role Overview: As a Software Engineer at ReliaQuest, you will have the opportunity to work on cutting-edge technologies and drive the automation of threat detection and response for a rapidly growing industry. You will be responsible for researching and developing creative solutions, creating REST APIs, managing deployment processes, performing code reviews, and automating various stages of the software development lifecycle. Collaboration with internal and external stakeholders will be key to ensuring seamless product utilization.

Key Responsibilities:
- Research and develop solutions using cutting-edge technologies to evolve the GreyMatter platform
- Create REST APIs and integrations to enhance and automate threat detection for customers
- Manage continuous integration and deployment processes for complex technologies
- Conduct code reviews to ensure consistent improvement
- Automate and enhance all stages of the software development lifecycle
- Collaborate closely with different parts of the business to facilitate easy product utilization
- Provide support to team members and foster a culture of collaboration

Qualifications Required:
- 3-6 years of software development experience for mid-level roles, or 7-15 years for senior-level positions, in Python, JS, React, Angular, Java, C#, MySQL, Elasticsearch, or equivalent
- Proficiency in written and verbal English
- Hands-on experience with technologies such as Elasticsearch, Kafka, Apache Spark, Logstash, Hadoop/Hive, TensorFlow, Kibana, Athena/Presto/Bigtable, Angular, and React
- Familiarity with cloud platforms like AWS, GCP, or Azure
- Strong understanding of unit testing, continuous integration, and deployment practices
- Experience with Agile methodology

This job at ReliaQuest offers you the chance to be part of a dynamic team working on groundbreaking security technology. Join us to contribute to the growth and success of the company while learning from some of the best in the industry.
posted 2 weeks ago
experience: 5 to 9 Yrs
location: Kerala, Thiruvananthapuram
skills
  • Java
  • Google Cloud Platform
  • SQL
  • Data engineering frameworks
  • Apache Beam
  • Apache Airflow
  • GCP Big Data Services
Job Description
As a Senior Cloud Developer at our company, you will be responsible for designing, developing, and deploying scalable data processing pipelines and orchestration workflows. Your expertise in Java (v17+), Google Cloud Platform (GCP), and data engineering frameworks will be crucial in ensuring high performance, reliability, and maintainability across large-scale systems.

Key Responsibilities:
- Design, develop, test, and deploy scalable and reliable data processing pipelines using Java 17+ and Apache Beam, executed on GCP Cloud Dataflow
- Build and manage complex data orchestration workflows using Apache Airflow or GCP Cloud Composer, including creating and maintaining DAGs with various common and custom operators
- Interact with and optimize data stored in GCP Big Data services such as BigQuery, Bigtable, and Google Cloud Storage (GCS)
- Write, optimize, and review complex SQL queries for data retrieval, transformation, and analysis, primarily within BigQuery and Cloud SQL
- Ensure adherence to code quality, testing standards, and technical documentation
- (Added advantage) Deploy and manage applications on GKE (Google Kubernetes Engine), manage security and permissions through GCP IAM, work with Cloud Spanner, and develop secure APIs for data access

Qualifications Required:
- Mandatory skills: Java 17+, GCP Cloud Dataflow (Apache Beam programming), Airflow/Cloud Composer (creation and maintenance of DAGs and common operators), GCP Big Data services (BigQuery, Bigtable, GCS, Cloud SQL), strong SQL programming and optimization skills
- Skills: GCP, Java, Cloud, Dataflow, SQL

Thank you for considering this opportunity with us.
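To give a flavor of the Composer/Airflow side of this role, here is a minimal DAG sketch (Airflow DAGs are written in Python even when the pipelines themselves are Java). The dag_id, task ids, schedule, and the extract/load callables are hypothetical; a real DAG for this role would likely trigger Dataflow and BigQuery operators from the Google provider package instead.

```python
# Minimal Airflow DAG sketch (Airflow 2.4+ "schedule" argument);
# dag_id, task ids, and the two callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from the source system")

def load():
    print("write transformed data to BigQuery")

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract runs before load
```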
posted 2 weeks ago

Senior Data Engineer

People Prime Worldwide
experience: 8 to 12 Yrs
location: All India, Hyderabad
skills
  • Python
  • GCP
  • ETL
  • SQL
  • GitHub
  • Apache Spark
  • Kafka
  • MongoDB
  • Redis
  • Bigtable
  • Snowflake
  • Dataflow
  • BigQuery
  • Cloud Functions
  • Cloud Composer
  • CICD
  • FastAPI
  • Databricks
  • GKE
  • Azure Data Factory
Job Description
As a Senior Data Engineer at the company, your role involves designing, developing, and maintaining scalable ETL pipelines and data systems using Python and Google Cloud Platform (GCP).

Key Responsibilities:
- Designing, developing, testing, and maintaining robust ETL data pipelines using Python.
- Working extensively with GCP services such as Dataflow, BigQuery, Cloud Functions, Cloud Composer (Airflow), IAM, Cloud Run, and Google Cloud Storage.
- Implementing data ingestion, transformation, and validation logic to maintain data quality and consistency.
- Collaborating with cross-functional teams, including data scientists and analysts, to deliver reliable data solutions.
- Managing version control through GitHub and contributing to CI/CD pipelines for data projects.
- Writing and optimizing complex SQL queries across databases like SQL Server, Oracle, and PostgreSQL.
- Creating and maintaining documentation, including data flow diagrams and process documentation.

Technical Expertise:
- Strong proficiency in Python for backend or data engineering projects.
- Deep working knowledge of GCP services, especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer.
- Experience with data orchestration and workflow tools like Airflow.
- Proficiency in Apache Spark and Kafka for data processing and streaming.
- Hands-on experience with FastAPI, MongoDB, and Redis/Bigtable.
- Sound understanding of CI/CD practices and version control systems like GitHub.
- Advanced SQL skills and experience with enterprise-grade relational databases.
- Solid experience in cloud migration and large-scale data integration.

It would be nice to have experience with Snowflake or Databricks for big data analytics and familiarity with GKE, Cloud Run deployments, or Azure Data Factory.
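To make the Python/Dataflow/BigQuery pipeline work concrete, here is a minimal Apache Beam batch sketch. The GCS path, destination table, and schema are hypothetical placeholders; running this on Dataflow rather than the local runner would additionally require DataflowRunner pipeline options and GCP credentials.

```python
# Minimal Beam batch ETL sketch; the GCS path, BigQuery table,
# and schema are placeholder values.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    name, score = line.split(",")
    return {"name": name, "score": int(score)}

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText(
            "gs://my-bucket/scores.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.scores",
            schema="name:STRING,score:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```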
posted 2 months ago

GCP Data Engineer

Tredence Inc.
experience: 5 to 9 Yrs
location: Maharashtra, Pune
skills
  • SQL
  • Apache Spark
  • Python
  • Hadoop
  • Spark
  • SQL
  • Bigtable
  • Cloud Storage
  • Machine Learning
  • Java
  • Python
  • BigQuery
  • Cloud Composer/Python
  • Cloud Functions
  • Dataproc with PySpark
  • Python Injection
  • Dataflow with Pub/Sub
  • Apache Beam
  • Google Dataflow
  • BigQuery
  • Datastore
  • Spanner
  • Cloud SQL
  • Snaplogic
  • Cloud Dataprep
Job Description
As a GCP Data Engineer, you will be responsible for designing and implementing solutions on Google Cloud Platform (GCP) utilizing various GCP components.

Key Responsibilities:
- Implementing and architecting solutions on GCP using components such as BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc with PySpark, Python Injection, and Dataflow with Pub/Sub.
- Experience with Apache Beam, Google Dataflow, and Apache Spark in creating end-to-end data pipelines.
- Proficiency in Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, and Machine Learning.
- Programming expertise in Java, Python, and other relevant technologies.
- Certification as a Google Professional Data Engineer or Solution Architect would be a significant advantage.

Qualifications Required:
- Minimum of 5 years of IT or professional services experience in IT delivery or large-scale IT analytics projects.
- In-depth knowledge of Google Cloud Platform; familiarity with other cloud platforms is a plus.
- Expertise in SQL development and building data integration tools using cloud technologies like SnapLogic, Google Dataflow, Cloud Dataprep, and Python.
- Identifying downstream implications of data loads/migration and implementing data pipelines for automation, transformation, and augmentation of data sources.
- Advanced SQL writing skills, experience in data mining, ETL processes, and using databases in a complex business environment.

You should be able to work effectively in a dynamic business environment and provide scalable data solutions for simplified user access to extensive datasets.
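Since this role highlights Dataflow with Pub/Sub, here is a minimal streaming Beam sketch in Python. The subscription path and the 60-second window are illustrative assumptions only.

```python
# Minimal streaming Beam sketch; the subscription path and the
# one-minute window size are placeholders.
import apache_beam as beam
from apache_beam import window
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

opts = PipelineOptions()
opts.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=opts) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/my-sub")
        | "Decode" >> beam.Map(lambda msg: msg.decode("utf-8"))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))
        | "CountPerWindow" >> beam.CombineGlobally(
            beam.combiners.CountCombineFn()).without_defaults()
        | "Print" >> beam.Map(print)
    )
```

The `without_defaults()` call is needed because a global combine over windowed, unbounded input cannot emit a default value for empty windows.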
posted 7 days ago

R&D Engineer / VLSI Engineer

MIRROR INSTITUTE FOR EMBEDDED TECHNOLOGY
experience: 0 to 4 Yrs
location: All India
skills
  • VLSI
  • Verilog
  • SystemVerilog
  • C
  • C++
  • ModelSim
  • Altera Quartus
  • Digital Electronics
  • Xilinx Vivado
Job Description
Role Overview: Join our R&D division at Mirror Institute for Embedded Technology (MIET) in Chennai to learn, design, and innovate in VLSI and embedded technologies. You will have the opportunity to gain hands-on experience with FPGA/ASIC design, Verilog/SystemVerilog, and Xilinx and Mentor Graphics tools, and to work on industry-grade projects. There is also potential for growth as a trainer, researcher, and innovator in advanced chip design and verification domains.

Key Responsibilities:
- Learn and work on FPGA/ASIC design projects
- Utilize Verilog/SystemVerilog and C/C++ for coding
- Work with Xilinx Vivado, ModelSim, and Altera Quartus tools
- Engage in effective communication and mentoring
- Demonstrate a passion for learning and innovation

Qualifications Required:
- M.E. in VLSI Design / Embedded Systems / Power Systems / Power Electronics, or M.Sc. in Electronics
- Candidates from Anna University, Tamil Nadu (Regular) are preferred
- Freshers and experienced candidates are welcome
- Academic criteria: minimum 70% in UG and 65% in 10th and 12th grades

Additional Company Details: Mirror Institute for Embedded Technology (MIET) is located at 184/2, 3rd Floor, Chandamama Building, Arcot Road, Vadapalani, Chennai 600026. The office is conveniently situated opposite Kamala Theater, above Viveks Showroom. MIET emphasizes a collaborative and innovative environment where employees have the opportunity for professional growth and skill development.

For further inquiries or to apply, please contact us at hrmirrorinstitute@gmail.com or call 93809 48474 / 93819 48474. This is a full-time position with a contract term of 3 years. The work location is in person.
posted 2 days ago

Lead GCP Data Engineer

People Prime Worldwide
experience: 10 to 14 Yrs
location: Hyderabad, Telangana
skills
  • Python
  • GCP
  • Kafka
  • DBT
  • DLP
  • Apache Spark
  • MongoDB
  • Redis
  • Bigtable
  • Airflow
  • GitHub
  • SQL
  • Snowflake
  • GKE
  • LLMs
  • FastAPI
  • Databricks
  • Azure Data Factory
Job Description
As a skilled and motivated Lead Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP), your role will involve designing, architecting, and maintaining robust and scalable ETL and ELT data pipelines. You will work directly with customers, gather requirements, design solutions, and implement data transformations using various GCP services. Your 10 to 12 years of experience will be crucial in fulfilling the following key responsibilities:

- Designing, developing, testing, and maintaining scalable ETL data pipelines using Python.
- Architecting enterprise solutions with technologies like Kafka, multi-cloud services, GKE auto-scaling, load balancers, Apigee proxy API management, DBT, redaction of sensitive information, and DLP.
- Working extensively on GCP services such as Dataflow, Cloud Functions, BigQuery, Cloud Composer, Google Cloud Storage, IAM, and Cloud Run.
- Implementing data ingestion, transformation, and cleansing logic to ensure high-quality data delivery.
- Enforcing data quality checks, validation rules, and monitoring.
- Collaborating with data scientists, analysts, and engineering teams to understand data needs and deliver efficient data solutions.
- Managing version control using GitHub and participating in CI/CD pipeline deployments.

Required Skills:
- Hands-on experience in Python for backend or data engineering projects.
- Strong understanding of and working experience with GCP cloud services.
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control systems like GitHub and CI/CD practices.
- Familiarity with Apache Spark, Kafka, Redis, FastAPI, Airflow, SQL databases, and data migrations to cloud platforms.

Good to Have:
- Experience working with the Snowflake cloud data platform.
- Hands-on knowledge of Databricks for big data processing.
- Familiarity with Azure Data Factory and other Azure data engineering tools.

In addition to the above, the company values problem-solving and analytical skills, strong communication skills, and the ability to collaborate in a team environment.

Education:
- Bachelor's degree in Computer Science, a related field, or equivalent experience.
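As one small example of the BigQuery work such pipelines feed, here is a parameterized query sketch with the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical.

```python
# Minimal BigQuery parameterized query sketch; project, dataset,
# table, and columns are placeholder names.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date = @day
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01"),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.user_id, row.events)
```

Query parameters keep dates and other user-supplied values out of the SQL string itself, which is also the standard guard against SQL injection.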
posted 2 months ago

Full Stack Developer (Node.js & React.js)

e-Stone Information Technology Private Limited
experience: 4 to 8 Yrs
location: Maharashtra
skills
  • JavaScript
  • RESTful APIs
  • WebSockets
  • PostgreSQL
  • MySQL
  • MongoDB
  • Bigtable
  • Jenkins
  • Node.js
  • React.js
  • TypeScript
  • Next.js
  • GraphQL
  • Firestore
  • Google Cloud Platform GCP
  • CICD pipelines
  • GitHub Actions
  • GitLab CICD
  • Security best practices
Job Description
As a talented Full Stack Developer with expertise in Node.js and React.js, your role will involve developing and maintaining scalable web applications with high performance. You will work on both frontend and backend development, ensuring the seamless integration of React.js for the frontend and Node.js (Express/Next.js) for the backend.

Key Responsibilities:
- Developing and maintaining scalable, high-performance web applications
- Designing, developing, and managing RESTful APIs and integrating third-party services
- Implementing authentication and authorization (OAuth, JWT, Firebase Auth)
- Optimizing applications for speed and scalability
- Deploying and managing applications on Google Cloud Platform (GCP) services
- Writing clean, maintainable, and well-documented code following best practices
- Collaborating with front-end developers, DevOps, and other stakeholders

Required Skills & Qualifications:
- Strong proficiency in JavaScript/TypeScript
- Hands-on experience with React.js and Next.js for frontend development
- Experience with RESTful APIs, GraphQL, and WebSockets
- Hands-on experience with relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Bigtable, Firestore) databases
- Familiarity with Google Cloud Platform (GCP) services
- Knowledge of CI/CD pipelines (GitHub Actions, GitLab CI/CD, Jenkins)
- Strong understanding of security best practices

Preferred Qualifications (Nice to Have):
- Familiarity with serverless computing (Cloud Functions, AWS Lambda, Firebase)
- Knowledge of performance tuning and application monitoring tools
- Exposure to Bun as an alternative runtime

Please share your updated resume at ruchita.parsekar@e-stonetech.com.
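The JWT issue/verify cycle mentioned in the responsibilities is language-agnostic; here is a rough sketch shown in Python with PyJWT, for consistency with the other sketches on this page (a Node.js stack would typically use a library such as jsonwebtoken instead). The secret, claims, and one-hour lifetime are illustrative placeholders.

```python
# Minimal JWT issue/verify sketch using PyJWT; the secret, claims,
# and token lifetime are illustrative choices only.
import datetime
import jwt

SECRET = "change-me"  # placeholder; real apps load this from config

def issue_token(user_id: str) -> str:
    payload = {
        "sub": user_id,
        "exp": datetime.datetime.now(datetime.timezone.utc)
               + datetime.timedelta(hours=1),
    }
    return jwt.encode(payload, SECRET, algorithm="HS256")

def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on failure.
    return jwt.decode(token, SECRET, algorithms=["HS256"])

print(verify_token(issue_token("user-42"))["sub"])  # user-42
```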
posted 2 months ago

Cloud Data Engineer

Niveus Solutions
experience: 3 to 7 Yrs
location: Karnataka
skills
  • SQL
  • Oracle
  • Postgres
  • Data Mart
  • GCP
  • Java
  • Python
  • Data Engineering
  • ETL
  • Spring Boot
  • Microservices
  • Nosql Database
  • DW
  • Data modelling
  • BigQuery
  • Apache Beam
  • Google Cloud Bigtable
  • Google BigQuery
  • Pub/Sub
  • Cloud Run
  • Cloud Function
Job Description
Role Overview: You will be responsible for building end-to-end data applications by integrating backend APIs with analytical front-ends. With over 3 years of IT experience, you are expected to have a good understanding of analytics tools for effective data analysis. Your ability to learn new tools and technologies will be crucial for this role. You should have prior experience working with at least one structural (SQL/Oracle/Postgres) and one NoSQL database. A strong understanding of Data Warehouse (DW), Data Mart, and data modelling concepts is essential, and you should have been part of a data warehouse design team in at least one project.

Key Responsibilities:
- Develop high-performance, scalable solutions using Google Cloud Platform (GCP) to extract, transform, and load big data.
- Design and build production-grade data solutions using Java/Python, from ingestion to consumption.
- Design and optimize data models on GCP cloud utilizing GCP data stores like BigQuery.
- Optimize data pipelines for performance and cost efficiency in large-scale data lakes.
- Write complex and highly optimized queries across extensive datasets to create data processing layers.
- Collaborate closely with Data Engineers to select the right tools for delivering product features by conducting proofs of concept.
- Research new use cases for existing data and work as a collaborative team player with business, BAs, and other Data/ML engineers.

Qualifications Required:
- Good understanding of design best practices for OLTP and OLAP systems.
- Exposure to load testing methodologies, debugging pipelines, and delta load handling.
- Experience in creating DAG files using Python and SQL for ETL processes.
- Familiarity with exploratory analysis of logs. Experience as an Apache Beam developer with Google Cloud Bigtable and Google BigQuery is desirable.
- Proficiency in Google Cloud Platform (GCP) and writing batch and stream processing jobs using the Apache Beam framework (Dataflow).
- Experience with Spring Boot; knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions will be beneficial for this role.
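Because the stack above leans on Pub/Sub for ingestion, here is a minimal publisher sketch with the google-cloud-pubsub Python client. The project and topic names are placeholders.

```python
# Minimal Pub/Sub publish sketch; the project and topic names are
# placeholder values.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")

# Message data must be bytes; extra kwargs become string attributes.
future = publisher.publish(topic_path, b'{"id": 1}', source="batch-etl")
print("published message id:", future.result())
```

A Dataflow or Cloud Functions consumer on the matching subscription would then pick these messages up for transformation and loading.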
posted 2 months ago

IT Security Analyst

The Citco Group Limited
experience: 3 to 7 Yrs
location: Hyderabad, Telangana
skills
  • IDM
  • Siteminder
  • Connectors
  • Core Java
  • Application Servers
  • JBOSS
  • Tomcat
  • Apache
  • Troubleshooting
  • Analysis
  • Development
  • Testing
  • Training
  • Communication
  • Deployment
  • Maintenance
  • Documentation
  • Information Security
  • Access management
  • CA Identity Manager
  • SAML
  • Federation
  • Certificate Authority
  • IAM services
  • Arcot
  • SDK libraries
  • APIJDBC interfaces
  • Java framework
  • Arcot custom flows
  • Authentication rules
  • Production issues
  • Patching verification
  • IAM product upgrades
  • CA IDM components
  • Identity Policy
  • Password policy
  • Policy Xpress
  • CA Identity Governance
  • CA Identity Portal
  • IDM SDK
  • ODSEE
  • OUD
  • LDAP Directory upgrades
  • Troubleshooting directory issues
  • Unix environments
  • Windows environments
  • CA SiteMinder Administration
  • Single SignOn
  • CA Strong Authentication support
  • Privilege Access Management
Job Description
Role Overview: As an IT IAM Security Analyst at Citco, your primary role involves the development and support of Identity and Access Management (IAM) services. You will be responsible for customizing IAM products such as IDM, Arcot, and SiteMinder using SDK libraries. Collaboration with application IT teams to develop API/JDBC interfaces for managing application access, and to create a Java framework to aggregate user access from applications, will also be part of your responsibilities. Additionally, you will develop custom flows in Arcot to handle authentication rules for different user groups, and handle complex production issues, patching verification, and IAM product upgrades. Interacting with various support and development groups, security team staff, business management, and end-users is an essential part of your duties.

Key Responsibilities:
- Hands-on experience with CA IDM components such as tasks, screens, BLTH, identity policies, password policies, and Policy Xpress
- Proficiency in CA Identity Governance, CA Identity Portal endpoint integration, and coding knowledge for connectors
- Experience with Core Java, the IDM SDK, and customizing connectors
- Knowledge of ODSEE/OUD, LDAP directory upgrades, and troubleshooting directory issues
- Installation and troubleshooting of applications in Unix and Windows environments
- Familiarity with application servers such as JBoss, Tomcat, and Apache
- Troubleshooting and resolving issues related to identities, systems, access, accounts, authentication, authorization, entitlements, and permissions
- Providing analysis, development, testing, training, communication, deployment, and maintenance of IAM systems
- Documenting processes, procedures, standards, and guidelines related to information security
- Collaborating with internal stakeholders to identify access management requirements
- Working independently, portraying a professional demeanor, and training other staff members and external clients

Qualifications Required:
- Bachelor's degree in Computer Science or a related field
- Graduate degree is a plus

Desired Knowledge/Skills:
- Experience with CA Identity Manager or an equivalent provisioning system
- Proficiency in CA SiteMinder administration
- Knowledge of Single Sign-On, SAML, and federation
- Experience with CA Strong Authentication support
- Familiarity with Privilege Access Management and Certificate Authority
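As a small illustration of the LDAP directory work referenced above, here is a minimal search sketch using the Python ldap3 library. The server URL, bind DN, password, and base DN are all hypothetical, and specifics will differ for ODSEE/OUD or CA Directory deployments.

```python
# Minimal LDAP lookup sketch with ldap3; the server URL, bind DN,
# password, and base DN are placeholder values.
from ldap3 import Server, Connection, ALL

server = Server("ldap://directory.example.com", get_info=ALL)
conn = Connection(server, user="cn=admin,dc=example,dc=com",
                  password="change-me", auto_bind=True)

# Look up one user's common name and mail attributes.
conn.search("dc=example,dc=com", "(uid=jdoe)", attributes=["cn", "mail"])
for entry in conn.entries:
    print(entry.cn, entry.mail)
conn.unbind()
```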
posted 2 months ago
experience: 6 to 10 Yrs
location: Haryana
skills
  • Genomics
  • Hadoop
  • Kafka
  • Spark
  • Pig
  • Hive
  • Java
  • Python
  • ITIL
  • Agile methodologies
  • Cloud SQL
  • Cloud Bigtable
  • Dataflow
  • BigQuery
  • Dataproc
  • Datalab
  • Dataprep
  • Pub/Sub
  • Google Transfer Appliance
  • Cloud Storage Transfer Service
  • BigQuery Data Transfer
Job Description
As part of the team at GlobalLogic working on a significant software project for a world-class company providing M2M/IoT 4G/5G modules to industries like automotive, healthcare, and logistics, you will be involved in developing end-user module firmware, implementing new features, maintaining compatibility with the latest telecommunication and industry standards, and conducting analysis and estimation of customer requirements.

**Key Responsibilities:**
- Experience working with data warehouses, including technical architectures, infrastructure components, ETL/ELT, and reporting/analytic tools.
- Experience in technical consulting.
- Architecting and developing software or internet-scale production-grade big data solutions in virtualized environments such as Google Cloud Platform (mandatory) and AWS/Azure (good to have).
- Working with big data, information retrieval, data mining, or machine learning, as well as building multi-tier high-availability applications with modern web technologies (such as NoSQL, Kafka, NLP, MongoDB, SparkML, TensorFlow).
- Working knowledge of ITIL and/or agile methodologies.

**Qualifications Required:**
- BA/BS degree in Computer Science, Mathematics, or a related technical field, or equivalent practical experience.
- Experience in Cloud SQL and Cloud Bigtable.
- Experience with tools such as Dataflow, BigQuery, Dataproc, Datalab, Dataprep, Pub/Sub, Genomics, Google Transfer Appliance, Cloud Storage Transfer Service, and BigQuery Data Transfer.
- Experience with data processing software (Hadoop, Kafka, Spark, Pig, Hive) and data processing algorithms (MapReduce, Flume).
- Experience writing software in languages such as Java or Python.
- 6-10 years of relevant consulting, industry, or technology experience.
- Strong problem-solving and troubleshooting skills.
- Strong communication skills.

At GlobalLogic, you will benefit from a culture of caring that prioritizes people first. You will experience an inclusive culture of acceptance and belonging, and build meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities await you at GlobalLogic: with various programs, training curricula, and hands-on opportunities, you can sharpen your skills and advance your career. You will have the chance to work on interesting and meaningful projects, making an impact for clients globally. The high-trust organization fosters integrity and trust, ensuring a safe, reliable, and ethical work environment. GlobalLogic, a Hitachi Group Company, has been a digital engineering partner to leading companies worldwide, contributing to innovative digital products and experiences since 2000. Join us in transforming businesses and redefining industries through intelligent products, platforms, and services.
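As a taste of the Hadoop/Spark side of the stack listed above, here is a minimal PySpark aggregation sketch. The input path and column name are hypothetical placeholders.

```python
# Minimal PySpark sketch; the GCS input path and the "device_type"
# column are placeholder names.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("device-counts").getOrCreate()

df = spark.read.option("header", True).csv("gs://my-bucket/devices.csv")
(
    df.groupBy("device_type")
      .count()                      # one row per device type
      .orderBy(F.desc("count"))     # most common types first
      .show()
)
spark.stop()
```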
posted 2 days ago
experience: 0 to 4 Yrs
location: All India
skills
  • Recruitment
  • Employee Engagement
  • Organizational Culture
  • HR Management
  • Communication Skills
  • Microsoft Office
  • Onboarding
  • Administrative Support
  • Employee Feedback
  • Organizational Abilities
Job Description
Role Overview: As an HR Intern at Arcot Group, you will have the opportunity to immerse yourself in the realm of human resources, focusing on recruitment, employee engagement, and organizational culture. This internship offers valuable experience and insight into the multifaceted world of human resource management. Your role will involve contributing to our people-centered approach through various responsibilities.

Key Responsibilities:
- Support the recruitment process by posting job openings, screening applications, coordinating interviews, and aiding in the selection process.
- Assist in the onboarding of new employees by preparing onboarding materials, maintaining HR databases, updating employee records, and participating in organizing employee engagement and training programs.
- Conduct surveys, collect employee feedback, and provide necessary administrative support to the HR team.

Qualifications Required:
- Positive attitude and a strong willingness to learn
- Excellent communication skills
- Ability to work both independently and collaboratively
- Basic understanding of HR principles and practices (advantageous)
- Proficiency in Microsoft Office tools such as Word, Excel, and PowerPoint

Join us at Arcot Group for a rewarding internship experience that will pave the way for a successful career in HR.
posted 1 month ago

Trainee - Digital

Newgen knowledge works
experience: 0 to 4 Yrs
location: Tamil Nadu
skills
  • XML
  • HTML
  • CSS
  • Microsoft Word
  • Microsoft Excel
  • PDF
Job Description
Freshers with an undergraduate degree completed between April 2022 and 2024 may apply for this remote full-time position with the following key skills:
- Basic knowledge of XML/HTML and CSS is a must
- Familiarity with Microsoft Word, Excel, and PDF tools

The job is located in Ranipet, Vellore, Kaveripakkam, Katpadi, Arcot, or Visharam. The application deadline for this position is June 27, 2025.
posted 4 days ago
experience: 4 to 8 Yrs
location: Maharashtra, Pune
skills
  • SQL
  • Spark
  • Scala
  • Python
  • Java
  • Tableau
  • Power BI
  • Alteryx
  • Google Data Products
  • BigQuery
  • Dataproc
  • Dataplex
  • Looker
  • Cloud data fusion
  • Data Catalog
  • Dataflow
  • Cloud composer
  • Analytics Hub
  • Pub/Sub
  • Dataprep
  • Cloud Bigtable
  • Cloud SQL
  • Cloud IAM
  • Google Kubernetes engine
  • AutoML
  • Qlik replicate
  • Qlik
  • Informatica Data Quality
Job Description
As a Data Engineer at Aptiv, you will play a crucial role in designing, developing, and implementing a cost-effective, scalable, reusable, and secure ingestion framework. Your responsibilities include working closely with business leaders, stakeholders, and source-system SMEs to understand and define business needs, translate them into technical specifications, and ingest data into Google Cloud Platform, specifically BigQuery. Additionally, you will be involved in designing and implementing processes for the ingestion, transformation, storage, analysis, modeling, reporting, monitoring, availability, governance, and security of high volumes of structured and unstructured data.

**Key Responsibilities:**
- Pipeline Design & Implementation: Develop and deploy high-throughput data pipelines using the latest GCP technologies.
- Subject Matter Expertise: Serve as a specialist in data engineering and Google Cloud Platform (GCP) data technologies.
- Client Communication: Engage with clients, understand their requirements, and translate them into technical data solutions.
- Technical Translation: Analyze business requirements, convert them into technical specifications, create source-to-target mappings, enhance ingestion frameworks, and transform data based on business rules.
- Data Cataloging: Develop capabilities to support enterprise-wide data cataloging.
- Security & Privacy: Design data solutions with a focus on security and privacy.
- Agile & DataOps: Utilize Agile and DataOps methodologies in project delivery.

**Qualifications Required:**
- Bachelor's or Master's degree in Computer Science, Data & Analytics, or a similar relevant subject.
- 4+ years of hands-on IT experience in a similar role.
- Proven expertise in SQL, including subqueries, aggregations, functions, triggers, indexes, database optimization, and relational data models.
- Deep experience with Google data products such as BigQuery, Dataproc, Dataplex, Looker, Data Catalog, and Dataflow.
- Experience in Qlik Replicate, Spark (Scala/Python/Java), and Kafka.
- Excellent written and verbal communication skills to convey technical solutions to business teams.
- Knowledge of statistical methods, data modeling, and industry standards in the data and analytics space.
- Ability to work effectively with globally distributed teams.
- Proficiency in designing and creating Tableau/Qlik/Power BI dashboards, Alteryx, and Informatica Data Quality.

Aptiv provides an inclusive work environment where individuals can grow and develop, irrespective of gender, ethnicity, or beliefs. Safety is a core value at Aptiv, which aims for a world with zero fatalities, zero injuries, and zero accidents. The company ensures resources and support for your family and your physical and mental health with a competitive health insurance package.

**Benefits:**
- Personal holidays
- Healthcare
- Pension
- Tax saver scheme
- Free onsite breakfast
- Discounted corporate gym membership

In addition to the above benefits, you will have the opportunity for professional growth and development in a multicultural environment, access to internal and external training, coaching, and certifications, recognition for innovation and excellence, and convenient transportation options at Grand Canal Dock. If you are passionate about data engineering, Google Cloud Platform, and making a positive impact in the mobility industry, we invite you to join Aptiv's IT Data Analytics team and contribute to our mission of creating a safer, greener, and more connected world.
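To illustrate the kind of BigQuery ingestion such a framework performs, here is a minimal GCS-to-BigQuery load sketch with the Python client. The bucket, dataset, and table names are placeholders, and schema autodetection is an assumption for the sketch; a governed framework would more likely pin an explicit schema.

```python
# Minimal BigQuery load sketch; the GCS URI and destination table
# are placeholder names.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders.csv",
    "my-project.staging.orders",
    job_config=job_config,
)
load_job.result()  # block until the load completes
print(client.get_table("my-project.staging.orders").num_rows, "rows loaded")
```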
posted 2 months ago
experience: 2 to 6 Yrs
location: Karnataka
skills
  • SQL
  • Oracle
  • Postgres
  • Data Mart
  • OLTP
  • Python
  • ETL
  • Google Cloud Platform
  • Spring Boot
  • Microservices
  • Nosql Database
  • DW
  • Data modelling
  • OLAP Systems
  • Load testing methodologies
  • Debugging pipelines
  • Delta load handling
  • Exploratory analysis
  • Apache Beam
  • Google Cloud Bigtable
  • Google BigQuery
  • Apache Beam Framework
  • Dataflow
  • Pub/Sub
  • Cloud Run
  • Cloud Function
Job Description
As an experienced IT professional with over 2 years of experience, you should have a good understanding of analytics tools to effectively analyze data, along with the ability to learn new tools and technologies. Your previous work experience should include working with at least one structural database (such as SQL, Oracle, or Postgres) and one NoSQL database. A strong understanding of Data Warehouse (DW), Data Mart, and data modelling concepts is essential, and you should have been part of a data warehouse design team in at least one project.

Key Responsibilities:
- Be aware of design best practices for OLTP and OLAP systems
- Have exposure to load testing methodologies, debugging pipelines, and delta load handling
- Create DAG files using Python and SQL for ETL processes
- Conduct exploratory analysis of logs
- Experience with Apache Beam development using Google Cloud Bigtable and Google BigQuery is desirable
- Familiarity with Google Cloud Platform (GCP)
- Ability to write batch and stream processing jobs using the Apache Beam framework (Dataflow)
- Experience with Spring Boot
- Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions

Qualifications Required:
- Minimum 2+ years of IT experience
- Strong understanding of analytics tools
- Proficiency in at least one structural and one NoSQL database
- Knowledge of Data Warehouse, Data Mart, and data modelling concepts
- Experience on a data warehouse design team
- Familiarity with Python and SQL for ETL processes
- Exposure to Apache Beam development and Google Cloud services
posted 2 months ago

HR MANAGER

Arcot manimark foods private limited
experience: 5 to 9 Yrs
location: Tamil Nadu
skills
  • Strategic Planning
  • Policy Development
  • Talent Acquisition
  • Employee Development
  • Performance Management
  • Compensation
  • Benefits
  • Employee Relations
  • Compliance Management
  • HR Analytics
  • Change Management
Job Description
As an HR Manager, you will play a crucial role in the strategic planning of the organization by collaborating with senior leadership to develop and implement HR strategies that align with the overall goals of the company.

**Key Responsibilities:**
- Develop and update HR policies and procedures to ensure compliance with laws and regulations.
- Oversee the recruitment process, including job postings, interviews, and onboarding of new employees.
- Implement training programs and career development initiatives to enhance employee skills and job satisfaction.
- Design and manage performance evaluation systems to measure and improve employee productivity.
- Develop and administer competitive compensation packages and employee benefit programs.
- Address employee concerns, mediate conflicts, and foster a positive work environment.
- Ensure compliance with labor laws, regulations, and industry standards.
- Utilize HR analytics to make informed decisions about workforce planning and talent management.
- Lead organizational change initiatives and help employees adapt to new processes or structures.

**Qualifications Required:**
- Bachelor's degree in Human Resources or a related field.
- Proven experience as an HR Manager or in a similar role.
- In-depth knowledge of labor laws and regulations.
- Strong leadership and communication skills.
- Ability to analyze data and make strategic decisions.
- Experience in change management is a plus.

*Note: This job is full-time with benefits including cell phone reimbursement, provided food, and provident fund. The schedule is a day shift with a yearly bonus. The work location is in person.*