
57 Data Validation Jobs in Bangalore

posted 2 months ago

PERMANENT WORK FROM HOME - CONTENT WRITER for TOP MNC in INDIA (Remote)

Futurz Consultancy Services Hiring For REPUTED TOP MNC IN INDIA
Experience: 0 to 3 Yrs
Salary: 3.0 - 6 LPA
Work: Remote
Location: Bangalore, Chennai, Hyderabad, Kolkata, Mumbai City, Delhi

skills
  • journalism
  • article writing
  • content writing
  • mass communication
  • content writer
Job Description
Greetings! This role offers a professional work environment, outstanding challenges, industry-competitive compensation, accelerated career growth, and overseas opportunities.

Mode of Work: Work From Home (WFH) (applicants should be flexible to work from the office in the future as business requires)
Work Location: Anywhere in India (pan-India) - WFH
Process: News Analyst (non-voice process)
Experience Required: Freshers only
CTC: Max. 3 LPA + NSA (approx. Rs. 25,000 monthly + Rs. 3,000 NSA)
Shift Timings: Rotational shifts (day or night; must be flexible) - 5 days/week

Mandatory Skills: Excellent verbal and written communication in English; knowledge of news, current affairs, and the media landscape.

Qualification: Any graduate/postgraduate with a specialization in Journalism, Mass Communication, or Political Science.

About the role: As a News Analyst, you will be an integral part of our algorithm training process, serving as a resource to help define factors that are important to our clients.

Who you are: You are a highly motivated new graduate or professional journalist who loves being at the forefront of breaking news and has a passion for technology, social media, and online content.

Desired Skills & Experience:
- Degree, preferably in Journalism, Communications, English, political/social sciences, or related fields
- Experience as a reporter covering breaking news or working in digital media, or an educational equivalent
- Excellent written and verbal communication skills
- Strong organizational skills; comfortable managing multiple competing priorities
- Ability to monitor and analyze real-time data from multiple datasets
- Audit and programmatically apply business rules for data validation
- QA algorithmic datastreams to improve the quality of the dataset output
- Participate in UX user research studies for internal QA tools
- Stay up to date on new policies, processes, and procedures impacting the QA workflow
- Able to adapt quickly in a rapidly changing environment
- Goal- and result-oriented mindset
- Professional proficiency in a foreign language is a plus

Basic Requirements:
- Excellent communication skills
- Experience working in an external client-facing environment
- Strong Excel and MS Office skills
- Ability to work cross-functionally across internal and external stakeholders, with a high bias for action
- Sound judgment, attention to detail, and flexibility in balancing program requirements, tight deadlines, and keeping people and projects moving on schedule
- Comfort and experience with a fast-paced start-up environment
- Fluent in English and an excellent communicator

Mandatory Checklist:
- Applicants must have a wifi internet connection of at least 150 Mbps at home and must be residents of the above cities only.
- Applicants should have education certificates (10th + 12th + graduation certificates with PC & CMM).
- Must have a PAN card + Aadhaar card + passport/election ID card.

Note: This is a group mail; if it does not match your profile, please ignore it, and forward it to friends who are looking for a change.

For more details, contact: 9182575391. Applicants available to join immediately are preferred.

Thanks & Regards,
Sana
FUTURZ CONSULTANCY SERVICES
Flat # 305, Prime Plaza Complex, Himayath Nagar, Hyderabad 500029, Telangana, India

posted 2 months ago

Epic Software Engineer (Remote)

Venpa Global Technologies Private Limited
Experience: 4 to 7 Yrs
Work: Remote
Location: Bangalore, Chennai, Hyderabad, Gurugram, Delhi

skills
  • javascript
  • node.js
  • epic systems
  • hipaa
  • fhir
  • epic apis
  • react
Job Description
Epic Software Engineer
Budget: 150000/month. Remote.

About the role: Our client is hiring an Epic Software Engineer to build and integrate apps that run inside Epic and connect with a patient services platform. You will design secure, scalable workflows for enrollment, consent, surveys, provider updates, and analytics using Epic APIs, SMART on FHIR, HL7 v2, and modern web technologies.

Requirements:
- 4+ years of professional software engineering experience, with at least 2 years in healthcare integrations or EHR app development.
- Hands-on experience with Epic APIs, SMART on FHIR app development, and FHIR resource modeling (see the sketch below).
- Strong proficiency in web development using JavaScript or TypeScript, React, and HTML/CSS.
- Proficiency building RESTful APIs and JSON contracts, including request validation, versioning, and backward-compatibility strategies.
- Practical knowledge of HL7 v2 segments, message types, ACK handling, and interface engines.
- Experience implementing OAuth 2.0; familiarity with SAML for enterprise SSO.
- Solid backend skills in Node.js or PHP; experience with C#/.NET or Java/Kotlin is a plus.
- SQL expertise with PostgreSQL or MS SQL, including schema design and query tuning.
- Docker-based development and AWS deployment experience.

Preferred qualifications:
- Experience publishing Epic apps through Epic programs and working with Epic client teams for onboarding.
- Familiarity with interface engines such as Mirth, Rhapsody, or Cloverleaf.
- Knowledge of consent frameworks, TCPA considerations for SMS, and de-identification techniques.
- Experience with SFTP-based data exchange at scale, checksum validation, and idempotent ingestion design.
- CI/CD with GitHub Actions, GitLab CI, or similar, plus infrastructure as code on AWS.
- Observability with tools such as CloudWatch, OpenTelemetry, or Datadog.
- Tableau or analytics-pipeline experience for clinical or brand reporting.
- Mobile development exposure in Swift or Objective-C for iOS, and Java or Kotlin for Android, for companion apps.
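For readers unfamiliar with SMART on FHIR, the kind of API call such an app makes can be sketched in a few lines of Python. This is a minimal illustration, not part of the posting: the base URL, token, and patient ID are hypothetical placeholders, and a real Epic integration would obtain its access token through a SMART on FHIR authorization flow.

import requests

# Hypothetical FHIR server base URL and OAuth bearer token; in a real SMART
# on FHIR app the token comes from the authorization flow at launch.
FHIR_BASE = "https://fhir.example.org/api/FHIR/R4"
TOKEN = "access-token-from-smart-launch"

def fetch_patient(patient_id: str) -> dict:
    """Fetch a FHIR Patient resource as JSON."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    patient = fetch_patient("example-patient-id")
    print(patient.get("name", []))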
posted 2 weeks ago

Ab Initio Developer

CAPGEMINI TECHNOLOGY SERVICES INDIA LIMITED
Experience: 5 to 10 Yrs
Location: Bangalore, Noida, Chennai, Hyderabad, Gurugram, Kolkata, Pune, Mumbai City, Delhi

skills
  • ab initio
  • unix shell scripting
  • sql
Job Description
Key Responsibilities:
- Design, develop, and implement ETL processes using Ab Initio GDE (Graphical Development Environment).
- Build and maintain Ab Initio graphs, plans, and sandboxes for data extraction, transformation, and loading.
- Work with business teams to understand data integration requirements and deliver efficient solutions.
- Use Ab Initio EME for version control, dependency management, and metadata governance.
- Perform data profiling, data validation, and quality checks using Ab Initio components and tools (see the sketch below).
- Optimize ETL workflows for performance, scalability, and maintainability.
- Implement robust error handling, restartability, and logging mechanisms.
- Collaborate with DBAs, data modelers, and analysts to ensure data accuracy and consistency.
- Schedule and monitor jobs using Ab Initio Control Center (AICC) or enterprise schedulers.
- Support production systems, troubleshoot issues, and perform root cause analysis.

Required Technical Skills:
- Strong hands-on experience in Ab Initio GDE, EME, Co>Operating System, and Control Center.
- Proficiency with Ab Initio components such as Input/Output, Transform, Partition, Sort, Join, Lookup, Rollup, Reformat, Scan, and Dedup Sort, along with error handling using Rejects, Error Tables, and Error Ports for robust ETL design.
- Expertise in ETL design, development, and deployment for large-scale data environments.
- Proficiency in SQL and relational databases such as Oracle, Teradata, DB2, or SQL Server.
- Experience with UNIX/Linux shell scripting for automation and workflow integration.
- Understanding of data warehousing concepts (star schema, snowflake schema, slowly changing dimensions).
- Strong performance tuning and debugging skills in Ab Initio.
- Familiarity with data quality, metadata management, and data lineage.
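Ab Initio validation logic lives in graphs rather than hand-written code, but the reconciliation idea behind a typical source-versus-target quality check can be sketched in plain Python. This toy example uses an in-memory SQLite database; the tables, columns, and values are invented for illustration.

import sqlite3

# Toy source and target tables standing in for a real ETL source/target pair.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, amount REAL);
    CREATE TABLE tgt (id INTEGER, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.5), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 20.5);
""")

def reconcile(table_a: str, table_b: str) -> None:
    """Compare row counts and a simple amount checksum between two tables."""
    for table in (table_a, table_b):
        count, total = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}"
        ).fetchone()
        print(f"{table}: rows={count}, sum(amount)={total}")

reconcile("src", "tgt")  # the count mismatch flags the missing target row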

posted 3 weeks ago

Data Engineer

NTECH IT SOLUTIONS PRIVATE LIMITED
Experience: 8 to 13 Yrs
Salary: 28 - 34 LPA
Location: Bangalore
skills
  • informatica
  • mdm
  • snowflake
  • master data management
Job Description
Key Responsibilities:
- Design, develop, implement, support, and maintain Informatica MDM solutions that align with business requirements and data governance policies
- Configure and customize Informatica MDM hubs
- Develop data models, matching rules, survivorship rules, and validation processes within the Informatica MDM platform (see the sketch below)
- Create and optimize data integration workflows for loading, cleansing, profiling, enriching, and synchronizing master data
- Implement data quality rules and data standardization processes
- Design and develop batch and real-time interfaces between MDM and source/target systems
- Troubleshoot and resolve complex data integration and MDM issues to ensure data integrity and performance
- Collaborate with data architects to design and implement data governance frameworks
- Work with business users to gather requirements and translate them into technical specifications
- Document technical designs, processes, and procedures, and maintain technical documentation for MDM implementations
- Perform code reviews and ensure adherence to best practices
- Provide technical guidance and mentorship to junior team members
- Stay current with the latest developments in Informatica products and MDM technologies

You have:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- 8+ years of experience as a Software Engineer, Data Engineer, or Data Analyst
- 5+ years of experience in data management, with at least 3 years of hands-on experience with Informatica MDM
- Strong understanding of Informatica MDM architecture and components (Hub, Manager, Administrator, Operational UI)
- Proficiency in Informatica MDM configuration, including data models, hierarchies, matching rules, and survivorship rules
- Experience with Informatica PowerCenter or Informatica Cloud for data integration with MDM
- Working knowledge of Informatica Data Quality (IDQ) and Address Doctor
- Strong SQL skills and experience with cloud (Snowflake) and relational databases (SQL Server)
- Experience with web services and API integration (REST, SOAP)
- Understanding of data modeling concepts (dimensional modeling, entity-relationship diagrams)
- Knowledge of XML, XSLT, and Java/JavaScript for MDM customization
- Experience with agile development methodologies
- Strong problem-solving and analytical skills
- Excellent communication skills and the ability to translate business requirements into technical solutions
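Match and survivorship rules in Informatica MDM are configured in the Hub rather than coded by hand, but the concept they express can be sketched in plain Python. The records below are invented; the sketch uses a naive name-based match key and a "most recently updated record wins" survivorship rule.

from datetime import date

# Hypothetical duplicate customer records from two source systems.
records = [
    {"source": "CRM", "name": "Asha Rao", "email": "asha@example.com", "updated": date(2024, 3, 1)},
    {"source": "Billing", "name": "ASHA  RAO", "email": "a.rao@example.com", "updated": date(2024, 6, 15)},
]

def match_key(rec: dict) -> str:
    """Naive match rule: case-insensitive, whitespace-normalized name."""
    return " ".join(rec["name"].lower().split())

def survive(candidates: list) -> dict:
    """Survivorship rule: the most recently updated record wins."""
    return max(candidates, key=lambda r: r["updated"])

groups = {}
for rec in records:
    groups.setdefault(match_key(rec), []).append(rec)

golden = [survive(g) for g in groups.values()]
print(golden)  # one golden record per matched group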
posted 1 week ago

Data Engineer

Lorven Technologies Private Limited
Experience: 6 to 10 Yrs
Work: Remote
Location: Bangalore, Chennai, Noida, Kolkata, Gurugram, Pune, Mumbai City, Delhi

skills
  • azure
  • data engineering
  • databricks
  • data factory
  • .net
  • python
Job Description
Data Engineer - Azure Data Factory, SQL, Python, Databricks, ETL (.NET + Azure Data Factory)

We are seeking a skilled Data Engineer to join our team. The ideal candidate will be responsible for designing, implementing, and maintaining data systems and architectures that support our business needs. This role requires expertise in working with large datasets, cloud technologies, and advanced data platforms to ensure the availability, quality, and accessibility of data for business analytics and decision-making.

Key Responsibilities:
- Design, build, and maintain data pipelines to collect, process, and store data efficiently from various sources.
- Work with cross-functional teams to understand business requirements and deliver data solutions that meet the needs of the organization.
- Optimize data storage and retrieval methods to enhance system performance and data quality.
- Integrate data from multiple sources (databases, data lakes, cloud storage, etc.) to build comprehensive data sets.
- Build and maintain data infrastructure, ensuring it is scalable, reliable, and secure.
- Implement and manage data governance policies, ensuring data accuracy, consistency, and compliance.
- Conduct data modeling and provide insights into the data through advanced analytics tools and reports.
- Perform data transformation and data wrangling tasks to prepare data for analysis (see the sketch below).
- Troubleshoot data issues and collaborate with stakeholders to resolve technical challenges.
- Ensure the integrity of data pipelines and systems by conducting routine testing and validation.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- Proven experience as a Data Engineer or in a similar role, with a minimum of [X] years of experience.
- Proficiency in data engineering tools and technologies such as SQL, Python, Java, Scala, and ETL frameworks.
- Experience working with cloud platforms (AWS, Azure, GCP) and data storage solutions (e.g., Redshift, Snowflake, BigQuery).
- Solid understanding of data modeling, database design, and cloud data architectures.
- Hands-on experience with data warehousing concepts and tools (e.g., Apache Hive, Apache Spark).
- Familiarity with data orchestration tools such as Apache Airflow and Azure Data Factory.
- Knowledge of real-time data streaming technologies (e.g., Kafka, Kinesis) is a plus.
- Strong problem-solving skills and the ability to troubleshoot complex data issues.
- Familiarity with data security best practices and data privacy regulations (e.g., GDPR, HIPAA).
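As a rough illustration of the cleansing and validation logic such pipelines implement, here is a minimal PySpark sketch. It assumes a local pyspark installation; the dataset, column names, and rules are invented for illustration.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-cleansing").getOrCreate()

# Hypothetical raw orders feed; a real pipeline would read this from ADLS,
# a database, or another ingested source.
raw = spark.createDataFrame(
    [("o1", "2024-06-01", 120.0), ("o2", None, 80.0), ("o3", "2024-06-03", -5.0)],
    ["order_id", "order_date", "amount"],
)

# Basic cleansing and validation: type the date, reject rows that fail rules.
typed = raw.withColumn("order_date", F.to_date("order_date"))
valid = typed.filter(F.col("order_date").isNotNull() & (F.col("amount") > 0))
rejects = typed.subtract(valid)

valid.show()    # rows fit for downstream loading
rejects.show()  # rows a real pipeline would route to a quarantine table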
posted 2 months ago

Talend Developer

Venpa Global Technologies Private Limited
Experience: 6 to 9 Yrs
Salary: 12 LPA
Location: Bangalore, Noida, Chennai, Hyderabad, Gurugram, Pune

skills
  • tac
  • etl
  • data integration
  • soap
  • talend studio
  • rest
  • talend
  • sql
  • data migration
  • talend developer
Job Description
Talend Developer (6 years' experience)

Upgrading Talend from 7.x to 8.0.1 involves not just infrastructure changes, but also project-level upgrades that require Talend Developers to adapt, refactor, and validate their ETL jobs. Below is a detailed overview of the skills, roles, and responsibilities of Talend Developers during this upgrade process.

Key Skills for Talend Developers (Upgrade 7.3 to 8.x):

Talend Studio Expertise
- Proficiency in Talend Studio for Data Integration/Big Data/ESB (based on edition).
- Familiarity with job designs, components, contexts, routines, and metadata repositories.
- Knowledge of repository-based project management and shared resources.

Component Compatibility and Migration
- Understanding of changes to components and deprecated features in 8.x.
- Ability to replace deprecated components with newer alternatives.
- Experience in upgrading custom routines, tJava code blocks, and external JARs.
- Experience with Git, SVN, or TAC-integrated version control.
- Knowledge of the Talend CommandLine (CI Builder) for automated builds in 8.x.

Testing and Validation
- Expertise in unit testing and job-level validation.
- Skill in comparing job outputs and logs across 7.x and 8.x (see the sketch below).
- Debugging and resolving issues caused by API, database driver, or job engine changes.

Database and API Knowledge
- SQL scripting for data validation and comparison.
- Understanding of any REST/SOAP API calls used in jobs.
- Familiarity with data quality, data cleansing, and transformation logic.

Roles and Responsibilities of Talend Developers:

Pre-Upgrade Assessment
- Review existing jobs for usage of deprecated or risky components.
- Tag jobs that need refactoring or heavy testing.
- Export and back up project sources from Talend Studio 7.x.

Project Upgrade Execution
- Open and migrate projects using Talend Studio 8.x.
- Resolve upgrade errors and component mapping warnings.
- Replace or reconfigure any unsupported or changed features.

Post-Upgrade Job Testing
- Performance testing to ensure job outputs remain accurate.
- Compare run times and log outputs for performance or logic issues.
- Validate dependencies (e.g., database connectors, external services).

Documentation and Collaboration
- Document all changes made to migrated jobs.
- Work with admins to troubleshoot TAC scheduling or job execution issues.
- Communicate with QA and data teams for test case validation.
- Implement reusable jobs and promote modular design using best practices.
- Recommend improvements in job design and monitoring post-upgrade.
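The output-comparison skill called out above can be as simple as diffing keyed extracts from the 7.x and 8.x runs of the same job. A minimal Python sketch; the file names and key column are hypothetical.

import csv
from pathlib import Path

def load_rows(path: Path, key: str) -> dict:
    """Index a CSV extract by its key column."""
    with path.open(newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def compare_extracts(old_path: Path, new_path: Path, key: str = "id") -> None:
    """Report rows that differ between a 7.x and an 8.x job output."""
    old, new = load_rows(old_path, key), load_rows(new_path, key)
    for k in sorted(old.keys() | new.keys()):
        if k not in new:
            print(f"{k}: missing from 8.x output")
        elif k not in old:
            print(f"{k}: new in 8.x output")
        elif old[k] != new[k]:
            print(f"{k}: field-level mismatch {old[k]} != {new[k]}")

# Hypothetical extracts produced by the same job on both versions.
compare_extracts(Path("out_7x.csv"), Path("out_8x.csv"))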
posted 1 week ago

SAP EWM Functional Consultant

CAPGEMINI TECHNOLOGY SERVICES INDIA LIMITED
Experience: 4 to 9 Yrs
Location: Bangalore, Pune, Mumbai City

skills
  • integration
  • configuration
  • sap ewm
  • sap wm
  • extended warehouse management
  • warehouse process management
  • functional consultant
  • s/4hana
Job Description
Job Title: SAP EWM Functional Consultant
Location: Mumbai, Pune, Bangalore

Job Summary: We are seeking a skilled SAP EWM Functional Consultant to join our team. The candidate will be responsible for implementing and supporting SAP Extended Warehouse Management (EWM) solutions within the S/4HANA environment, ensuring efficient warehouse operations and seamless integration with supply chain processes.

Key Responsibilities:
- Analyze business requirements and translate them into functional and technical specifications.
- Configure and customize SAP EWM modules, including inbound/outbound processes, internal warehouse movements, and physical inventory.
- Design and implement warehouse strategies such as putaway, picking, packing, and staging.
- Integrate EWM with other SAP modules like MM, SD, and TM.
- Conduct system testing, data validation, and support user acceptance testing (UAT).
- Utilize S/4HANA features and Fiori apps to enhance warehouse operations and user experience.
- Provide ongoing support, issue resolution, and continuous process improvements.
- Document functional designs, process flows, and configuration details.

Required Qualifications:
- Bachelor's degree in Logistics, Supply Chain, Computer Science, or a related field.
- Strong understanding of warehouse operations, inventory management, and logistics execution.
- Hands-on experience with SAP S/4HANA and EWM (embedded or decentralized).

Preferred Skills:
- Experience with the RF framework, labor management, and integration with automation systems.
- Exposure to SAP Fiori/UI5 and OData services.
- Knowledge of SAP BTP and digital supply chain solutions.
- Excellent analytical and problem-solving skills.
- Strong communication and stakeholder engagement abilities.
posted 1 week ago

Senior MicroStrategy Developer

Experience: 5 to 10 Yrs
Location: Bangalore, Chennai
skills
  • mstr
  • microstrategy developer
  • mstr developer
Job Description
We are currently hiring for a Senior MicroStrategy (MSTR) Developer role, and your profile seems to match the requirement. Please find the job description below for your reference.

Role: Senior MicroStrategy Developer
Location: Chennai or Bangalore
Notice Period: 15 days
Experience: 4 to 12 years

Key Skills and Experience:
- 4 to 12 years of experience as a Senior MicroStrategy Developer, responsible for designing, developing, and maintaining Business Intelligence solutions using the MSTR platform
- Ability to work independently and also guide team members when required
- Strong hands-on experience in MicroStrategy
- Proficient in object creation, virtual data structures, and SQL
- Experience in developing high-end dashboards with strong validation techniques
- Excellent documentation skills
- Good team player with strong attention to detail

Responsibilities:
- Design and develop reports and dashboards
- Work closely with the Delivery Lead, MicroStrategy Tech Lead, PSS, and Infrastructure teams
- Develop reports and dashboards with rich UI functions and visualizations using the MicroStrategy reporting tool
- Analyze business requirements and provide inputs to the tech lead; assist in creating end-to-end designs, including technical implementation
- Implement performance tuning techniques for MicroStrategy dashboards and reports; review performance and recommend optimization techniques using VLDB settings and explain plans

If the role interests you, please share your updated resume along with your current CTC, expected CTC, and notice period.

Thanks and Regards,
Capgemini HR Team
posted 2 weeks ago

SAP HANA Techno-Functional Consultant

Experience: 10 to 20 Yrs
Salary: 20 - 30 LPA
Location: Bangalore
skills
  • ar
  • r2r
  • sap hana administration
  • sap hana
  • ap
Job Description
Job Title: SAP HANA Techno-Functional Consultant
Job Type: Full Time
Job Location: Bangalore
Salary: 20-30 LPA
Years of Experience: 10-20 yrs

Job Overview: We are seeking a highly skilled SAP HANA Techno-Functional Consultant to join our team. The ideal candidate will have strong expertise in SAP RISE/GROW with Public Cloud, covering both technical and functional aspects, to support business processes, system configurations, integrations, and automation. This role requires a deep understanding of SAP RISE/GROW with Public Cloud, ABAP development, Fiori/UI5, data migration, and ERP configurations to enhance system performance and user experience, along with strong expertise in SAP Finance (AR, AP, R2R, Banking, Assets, etc.) and Controlling modules, data migration, and process automation to support business objectives and enhance system capabilities.

Key Responsibilities:

Technical Responsibilities:
- Develop, customize, and optimize SAP RISE/GROW with Public Cloud solutions using ABAP, Fiori/UI5, and CDS Views.
- Conduct data migration activities, ensuring accuracy and system integrity during transitions.
- Design and implement SAP integrations.
- Develop and manage SAP workflows, validations, and security configurations.
- Perform troubleshooting, debugging, and code optimization to improve system efficiency.
- Perform system upgrades, patches, and performance tuning for SAP environments.

Functional Responsibilities:
- Work closely with finance teams to ensure smooth integration between SAP Finance (FI/CO) and other business modules.
- Analyze business workflows and propose SAP-based solutions for process improvement.
- Configure SAP modules, including Finance (FI), Controlling (CO), Materials Management (MM), and Sales & Distribution (SD).
- Ensure compliance with SAP best practices, security policies, and audit requirements.
- Provide training and documentation for end users and internal teams.

Key Skills & Competencies:
- Minimum of 10+ years of experience in SAP HANA Public Cloud implementation and migration (all modules).
- Knowledge of SAP Fiori/UI5, CDS Views, and AMDP.
- Expertise in ABAP development.
- Hands-on experience with SAP Workflow, SAP Security, and Role Management.
- Strong understanding of financial processes and integration with SAP modules.
- Experience with API-based integrations, middleware (SAP PI/PO, SAP CPI), and cloud-based solutions.
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication and stakeholder management abilities.

Education & Experience:
- Bachelor's/Master's degree in Computer Science, Information Technology, or related fields (BTech, MTech, MCA).
- 10+ years of experience in SAP techno-functional roles.
- Experience with SAP HANA Public Cloud implementations, upgrades, and custom developments.
- Strong exposure to SAP Finance, SAP MM, SD, and system integrations.
posted 7 days ago

DataStage Developer

Experience: 6 to 11 Yrs
Location: Bangalore
skills
  • etl testing
  • etl
  • devops
  • sql queries
  • data warehousing
  • datastage etl
  • unix scripting
  • python
  • sql
  • control-m
Job Description
Job Description: DataStage Developer

We are seeking an experienced DataStage Developer to design, build, and optimize data integration solutions across enterprise data platforms. The ideal candidate will have strong expertise in DataStage, data warehousing concepts, SQL, and automation using DevOps and scripting tools.

Key Responsibilities:
- Design, develop, and maintain ETL workflows using IBM DataStage in alignment with business and technical requirements.
- Work with large-scale data warehousing environments to support data integration, transformation, and loading processes.
- Write and optimize complex SQL queries for data extraction, validation, and performance tuning.
- Develop and maintain Unix shell scripts for automation, monitoring, and job orchestration (see the sketch below).
- Utilize DevOps tools for version control, CI/CD, and deployment of ETL pipelines.
- Leverage Python for advanced data processing, automation, and integration tasks.
- Manage ETL scheduling, monitoring, and batch operations using Control-M.
- Collaborate with data architects, analysts, and cross-functional teams to ensure data quality, accuracy, and consistency.
- Troubleshoot ETL failures, performance issues, and data discrepancies in production and development environments.
- Document technical designs, data flows, and operational procedures.

Required Skills & Qualifications:
- Hands-on experience with IBM DataStage (Parallel & Server jobs).
- Strong understanding of data warehousing principles and ETL best practices.
- Proficiency in SQL, with the ability to work on complex queries and tuning.
- Experience with Unix scripting, job automation, and server-side operations.
- Knowledge of Python for scripting and data manipulation.
- Familiarity with DevOps tools such as Git, Jenkins, or similar.
- Experience with Control-M or equivalent workload automation tools.
- Strong analytical, troubleshooting, and problem-solving abilities.
- Excellent communication skills and the ability to work collaboratively in cross-functional teams.
- Bachelor's degree mandatory.
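As one small example of the Python automation this role mentions, a monitoring script might scan job logs for failure markers between scheduled Control-M runs. This is a sketch only: the log directory and message patterns are invented, and real DataStage log formats vary by installation.

import re
from pathlib import Path

# Hypothetical log location and failure markers.
LOG_DIR = Path("/var/logs/datastage")
FAILURE_PATTERN = re.compile(r"FATAL|ABORTED|Job failed", re.IGNORECASE)

def scan_logs(log_dir: Path) -> list:
    """Return log lines that look like job failures, for alerting."""
    hits = []
    for log_file in sorted(log_dir.glob("*.log")):
        for line_no, line in enumerate(log_file.read_text().splitlines(), 1):
            if FAILURE_PATTERN.search(line):
                hits.append(f"{log_file.name}:{line_no}: {line.strip()}")
    return hits

if __name__ == "__main__":
    for hit in scan_logs(LOG_DIR):
        print(hit)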
posted 2 months ago
Experience: 6 to 11 Yrs
Location: Bangalore, Chennai, Navi Mumbai, Pune, Mumbai City

skills
  • sql
  • data storage
  • git
  • data factory
  • azure
  • databricks
  • data engineer
  • pyspark
Job Description
We have an immediate opening for Azure Databricks with strong experience in PySpark and SQL. We are looking for candidates who are available to join immediately or within 2 weeks. Interested candidates can share resumes to navyasree.kotha@deltassi.in

Role: ADB + PySpark
Job Type: Full-time (client payroll)
Mode: WFO/Hybrid
Location: Bengaluru/Chennai/Pune/Mumbai
Experience: 6+ years
Must have: Azure Databricks (ADB) + PySpark + SQL

Key Responsibilities:
- Develop and optimize ETL pipelines and data transformation workflows using Azure Databricks, PySpark, and SQL.
- Work with Azure Data Lake, Azure Synapse, and other Azure data services for data ingestion, storage, and analytics.
- Collaborate with data architects, analysts, and business stakeholders to understand requirements and deliver high-quality data solutions.
- Implement data cleansing, validation, and transformation logic in PySpark for structured and unstructured data.
- Write efficient SQL queries for data extraction, analysis, and reporting.
- Optimize Databricks jobs for performance and cost efficiency.
- Implement best practices in code management, version control (Git), and CI/CD pipelines for Databricks.
- Ensure compliance with data governance, security, and quality standards.
- Troubleshoot and resolve data processing issues in production environments.

Required Skills and Experience:
- 3+ years of experience in data engineering or related roles.
- Strong hands-on experience with Azure Databricks and PySpark (see the sketch below).
- Proficient in SQL query optimization, joins, window functions, and complex transformations.
- Experience with Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), and Azure Synapse Analytics.
- Familiarity with Delta Lake concepts (ACID transactions, time travel, etc.).
- Understanding of data modeling and ETL concepts.
- Proficient in Python for data manipulation and automation.
- Strong analytical and problem-solving skills.
- Experience with Agile/Scrum methodology and tools like JIRA or Azure DevOps.
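A common PySpark pattern in such pipelines is window-function deduplication: keep only the latest record per key before loading a curated layer. A minimal sketch with invented data; the Delta table named in the final comment is hypothetical.

from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dedup-latest").getOrCreate()

# Hypothetical customer-update events; duplicates across loads are common
# in ingestion layers, so keep only the latest event per customer.
events = spark.createDataFrame(
    [("c1", "2024-06-01", "asha@example.com"),
     ("c1", "2024-06-10", "asha.rao@example.com"),
     ("c2", "2024-06-05", "ravi@example.com")],
    ["customer_id", "updated_at", "email"],
)

w = Window.partitionBy("customer_id").orderBy(F.col("updated_at").desc())
latest = (events
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))

latest.show()
# On Databricks this would typically land in a Delta table, e.g.:
# latest.write.format("delta").mode("overwrite").saveAsTable("silver.customers")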
posted 3 weeks ago

Web Data Scraping Intern

Stylumia

Experience: 0 to 4 Yrs
Location: Bangalore, Karnataka
skills
  • Python
  • SQL
  • NoSQL
  • Git
  • Redis
  • Elasticsearch
  • Docker
  • AWS
  • GCP
  • Azure
  • Scrapy
  • BeautifulSoup
  • Puppeteer
  • Playwright
Job Description
Role Overview: As a Web Data Scraping Intern at Stylumia, you will play a vital role in designing and implementing data extraction, transformation, and loading processes. Your day-to-day tasks will involve creating and managing data models, building data warehouses, and analyzing datasets to support decision-making. By optimizing data workflows, you will contribute to the foundation of a secure and accurate demand-driven intelligence system.

Key Responsibilities:
- Develop and maintain web scraping scripts using Scrapy, BeautifulSoup, and Puppeteer/Playwright (see the sketch below).
- Utilize SQL/NoSQL databases for storing, processing, and querying data.
- Optimize scrapers for efficiency, reliability, and scalability.
- Ensure data accuracy, validation, and cleaning.
- Collaborate with engineering and data teams to integrate scraped data into downstream systems.
- Explore and implement the latest scraping tools and techniques.

Qualifications Required:
- Strong programming skills in Python (Node.js is a plus).
- Experience with Scrapy and other scraping frameworks.
- Familiarity with SQL, Git, Redis, and Elasticsearch.
- Problem-solving mindset with attention to detail.
- Good communication and teamwork skills.
- Currently pursuing or completed a degree in CS, IT, or related fields.

Additional Details: Stylumia, founded in 2015, is dedicated to revolutionizing retail strategy by reducing wastage through advanced intelligence. The company's solutions, such as demand science and real-time consumer intelligence, drive revenue growth for brands and retailers globally. Partnering with Stylumia offers the opportunity to work on real-world, large-scale data projects, receive mentorship from experienced data engineers, and gain hands-on exposure to modern data tools and pipelines.
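A minimal scraping sketch in the spirit of this role, using requests and BeautifulSoup rather than a full Scrapy project. The URL and CSS selectors are placeholders; a production spider would add pagination, retries, robots.txt handling, and rate limiting.

import requests
from bs4 import BeautifulSoup

# Hypothetical product-listing page.
URL = "https://shop.example.com/dresses"

def scrape_products(url: str) -> list:
    resp = requests.get(url, headers={"User-Agent": "demo-scraper"}, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    products = []
    for card in soup.select("div.product-card"):
        name = card.select_one("h2.title")
        price = card.select_one("span.price")
        if name and price:  # basic validation before storing
            products.append({"name": name.get_text(strip=True),
                             "price": price.get_text(strip=True)})
    return products

if __name__ == "__main__":
    for product in scrape_products(URL):
        print(product)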
posted 2 weeks ago

Clinical Data Services Analyst

Accenture

Experience: 3 to 7 Yrs
Location: Bangalore, Karnataka
skills
  • Database Design
  • Clinical Data Services
  • Clinical EDC Build
  • Edit Check Programming
  • Custom Function Programming
  • IVRS/RWS Integration
  • Lab Administration
Job Description
As a Clinical Data Services Analyst at Accenture, you will be part of the Clinical Data Management team, focusing on the collection, integration, and availability of data at appropriate quality and cost. Your role will include performing data management activities such as discrepancy review, query generation, and resolution. You will also be responsible for creating CRF Completion Guidelines (CCG) and SAE reconciliation guidelines. Additionally, you will help identify and raise protocol deviations in the database, perform edit check validation, write test scripts, and carry out database validation (UAT) against the specified CRF/eCRF. Your responsibilities will also involve managing clinical data management projects.

Key Responsibilities:
- Develop clinical study databases by building electronic case report forms and programming edit checks as per specifications (see the sketch below).
- Support any updates or changes to the study database (e.g., protocol amendments) through the change control process.

Qualifications Required:
- BSc/Master of Pharmacy
- 3 to 5 years of experience
- Language proficiency in English (International) at an expert level

About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud, and security. With over 699,000 employees serving clients in more than 120 countries, Accenture offers Strategy and Consulting, Technology and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. The company embraces the power of change to create value and shared success for its clients, people, shareholders, partners, and communities. For more information, visit www.accenture.com.

In this role, you will be aligned with the Life Sciences R&D vertical at Accenture, where services span the entire life sciences enterprise. Employees in this vertical work on sub-offerings like Clinical, Pharmacovigilance & Regulatory, helping leading biopharma companies bring their vision to life by improving outcomes through patient-centric approaches and scientific expertise.

What We Are Looking For:
- Adaptable and flexible individuals
- Ability to perform under pressure
- Strong problem-solving skills
- Detail-oriented approach
- Capability to establish strong client relationships

If you are a Clinical Database Developer with over 4 years of experience and exposure to EDC platforms like Medidata RAVE EDC, InForm, Oracle Clinical, Veeva, etc., this role is ideal for you. You should have extensive experience in database design, edit check programming, custom function programming, and unit testing. Certification as a Medidata RAVE study builder is preferred. Additionally, experience with modules such as IVRS/RWS integration and Lab administration will be beneficial.
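Edit checks in EDC platforms such as Medidata RAVE are configured in the study builder rather than hand-coded, but the logic they encode can be sketched in plain Python. The CRF fields, plausibility ranges, and records below are invented for illustration.

from datetime import date

# Hypothetical CRF records.
records = [
    {"subject": "S001", "enrollment": date(2024, 1, 10), "visit": date(2024, 1, 5), "sbp": 132},
    {"subject": "S002", "enrollment": date(2024, 1, 12), "visit": date(2024, 2, 1), "sbp": 410},
]

def edit_checks(rec: dict) -> list:
    """Fire queries for out-of-range or logically inconsistent values."""
    queries = []
    if rec["visit"] < rec["enrollment"]:
        queries.append("Visit date precedes enrollment date")
    if not 60 <= rec["sbp"] <= 260:
        queries.append(f"Systolic BP {rec['sbp']} outside plausible range")
    return queries

for rec in records:
    for query in edit_checks(rec):
        print(f"{rec['subject']}: QUERY - {query}")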
posted 6 days ago

Data Quality Analyst

4CRisk.ai Software
Experience: 1 to 5 Yrs
Location: Bangalore, Karnataka
skills
  • Data validation
  • Data profiling
  • Data modeling
  • Statistical analysis
  • Documentation
  • Communication skills
  • Data Quality analysis
  • Root-cause analysis
  • Data slicing
  • Data lineage
  • Data Validation Testing
  • Data querying
Job Description
As a Data Quality Analyst at 4CRisk.ai Software Private Ltd. in Bangalore, India, you will play a crucial role in utilizing regulatory data to drive product decisions. By collaborating with cross-functional teams, your expertise will be instrumental in delivering customer insights and shaping the products offered to our customers. Your work will involve leveraging rich user data through cutting-edge technology to transform your insights into tangible products.

Key Responsibilities:
- Perform statistical tests on large datasets to assess data quality and integrity (see the sketch below).
- Evaluate system performance and design to ensure data quality standards are met.
- Collaborate with AI and Data Engineers to enhance data collection and storage processes.
- Run data queries to identify quality issues and data exceptions, and clean the data.
- Gather data from primary and secondary sources to identify trends and interpret them.
- Report data analysis findings to management to inform business decisions and prioritize information system needs.
- Document processes and maintain data records following best practices in data analysis.
- Stay updated on developments and trends in data quality analysis.

Required Experience/Skills:
- Experience in data quality analysis, including root-cause analysis and data slicing.
- Design, build, and execute data quality plans for complex data management solutions on modern data processing frameworks.
- Understand data lineage and prepare validation cases to verify data at each stage of the processing journey.
- Develop dataset creation scripts for verification at the extraction, transformation, and loading phases.
- Support AI and Product Management teams by contributing to the development of data validation strategies.
- Document issues and collaborate with data engineers to maintain quality standards.
- Efficiently capture business requirements and translate them into functional specifications.
- Experience in data profiling, data modeling, and data validation testing is a plus.
- 1 to 3+ years of proven experience in a similar role.
- Excellent presentation and communication skills in English, both oral and written, for effective interaction with management and customers.
- Ability to work collaboratively with team members globally and across departments.

Location: Bangalore, India
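A small pandas sketch of the kind of data-quality profiling this role describes; the columns, values, and checks are invented for illustration, and pandas is assumed to be available.

import pandas as pd

# Hypothetical regulatory-document metadata.
df = pd.DataFrame({
    "doc_id": ["d1", "d2", "d2", "d4"],
    "jurisdiction": ["US", "UK", "UK", None],
    "effective_date": ["2024-01-01", "2024-02-30", "2024-02-10", "2024-03-05"],
})

# Simple profile: nulls, duplicate keys, and unparseable dates.
report = {
    "rows": len(df),
    "null_jurisdiction": int(df["jurisdiction"].isna().sum()),
    "duplicate_doc_ids": int(df["doc_id"].duplicated().sum()),
    "bad_dates": int(pd.to_datetime(df["effective_date"], errors="coerce").isna().sum()),
}
print(report)  # feed into a QA report, or fail a pipeline on thresholds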
posted 2 months ago

Functional Validation Engineer

Capgemini Engineering

Experience: 5 to 9 Yrs
Location: Bangalore, Karnataka
skills
  • Wireshark
  • Analytical skills
  • Communication skills
  • NI TestStand
  • Signalling systems
  • Locomotive components
  • Problem-solving skills
Job Description
Role Overview: At Capgemini Engineering, a global leader in engineering services, you will join a team of engineers, scientists, and architects dedicated to helping innovative companies reach their full potential. As a Functional Validation Engineer based in Bangalore, you will play a crucial role in validating complex systems and subsystems in the signalling and locomotive domain. Your work will involve collaborating with engineering and QA teams to ensure functionality, reliability, and compliance with industry standards.

Key Responsibilities:
- Perform functional validation activities using NI TestStand and Wireshark.
- Validate signalling systems and locomotive components to ensure operational integrity.
- Develop and execute test cases based on system requirements and specifications.
- Analyze test results and troubleshoot issues to support system improvements.
- Collaborate with cross-functional teams to enhance validation coverage and efficiency.
- Document validation processes and contribute to internal knowledge-sharing initiatives.

Qualifications Required:
- 4-9 years of experience in functional validation or system testing.
- Hands-on experience with NI TestStand and Wireshark.
- Exposure to signalling and locomotive domain technologies.
- Strong analytical and problem-solving skills.
- Effective communication and collaboration abilities.
- Bachelor's degree in Engineering, Computer Science, or a related field.

Additional Company Details: Capgemini is a global business and technology transformation partner with a strong heritage of over 55 years. With a diverse team of 340,000 members in more than 50 countries, Capgemini is trusted by clients to deliver end-to-end services and solutions leveraging expertise in AI, cloud, data, and more. The company is committed to accelerating the transition to a digital and sustainable world, creating tangible impact for enterprises and society. As a Capgemini employee, you can enjoy flexible work arrangements, an inclusive culture that fosters innovation and collaboration, and access to continuous learning and certifications in emerging technologies.
posted 3 weeks ago

BI Tech Lead - Production Support

Experience: 7 to 11 Yrs
Location: Bangalore, Karnataka
skills
  • Power BI
  • SQL
  • Azure DevOps
  • SNOW
  • Qlik
Job Description
As a BI Tech Lead specializing in Production Support, you will have a minimum of 7-8 years of experience in Business Intelligence, with a specific focus on Power BI for at least 6 years. Your expertise will be crucial in handling support activities after the production phase. The immediate-joining requirement reflects the urgent need for your skills in Power BI, Qlik, SQL, Azure DevOps, and ServiceNow (SNOW). A solid understanding of Azure DevOps and SQL is essential, and even a basic grasp of Qlik will be beneficial.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Advanced expertise in Power BI for both development and administration tasks.
- Strong data validation skills across the BI landscape.
- Expert knowledge of Qlik (both QlikView and QlikSense) and a solid grasp of data operations and performance tuning.
- Knowledge of Azure DevOps is preferred.
- Excellent communication and problem-solving skills.
- A minimum of 8 years of hands-on experience with BI visualization tools, including designing and delivering complex dashboards, with strong debugging skills; administrative knowledge of the tools is strongly preferred.
- Advanced visualization skills for designing custom reports.
- Fluency in English.

Technical Skills:
- Power BI: advanced report development and administration, plus Power BI Report Builder for paginated reports.
- Qlik tools: QlikView, QlikSense, and NPrinting (preferred).
- ServiceNow for ticket management and service operations workflows (preferred).
- Azure DevOps (preferred).
- Microsoft Office, especially advanced Excel.
- Databases: ability to write and optimize SQL analytical queries.
- Experience with cloud platforms such as Azure or AWS is beneficial.

The role involves data extraction in SQL, managing reports in the Power BI service, writing DAX queries, data wrangling in Power Query, and troubleshooting data reports. Responsibilities also include working with Azure, the Power BI service, and on-premises gateway services. Your ability to handle these tasks efficiently will be crucial to the smooth operation of the BI landscape and effective support of the production phase.
posted 2 weeks ago

Java Developer

42SIGNS INFOSYSTEMS LLP
Experience: 5 to 10 Yrs
Salary: 9 - 18 LPA
Location: Bangalore
skills
  • java
  • as400
  • j2ee
Job Description
Company name: Tramway Inc (http://www.tranwayinc.com/), working for client Infosys (work location)
Experience: 5+ years
Location: Bangalore, Electronic City

The Java developer is responsible for complete end-to-end web development, including the UI, business, and data layers. You will understand architecture requirements and ensure effective design, development, validation, and support activities, with adherence to organizational guidelines and processes. The primary focus will be on the latest Java technology with J2EE, Spring, and REST APIs. You will be working with other engineers and developers on different layers of the infrastructure; therefore, commitment to collaborative problem solving, next-generation design, and creating quality products is essential.

Responsibilities:
- Design, develop, and maintain AS400 applications.
- Analyze business requirements and translate them into technical solutions.
- Modify and enhance existing AS400 programs.
- Perform unit testing and resolve software defects.
- Provide technical support and troubleshooting for AS400 systems.
- Plan for continuous improvement and build a continuous integration, continuous development, and constant deployment pipeline (CI/CD pipeline).

Mandatory skills:
- Java, J2EE, AS400
- HTML, CSS, JavaScript
- REST API integration with the frontend
- DevOps with Jenkins, GitHub, Kubernetes, Docker

If you are interested in this role, please share your updated resume with prerna@42signs.com or 8197 840 980.
posted 2 months ago

Senior iOS Developer

CYANOUS SOFTWARE PRIVATE LIMITED
Experience: 10 to 12 Yrs
Salary: 26 - 36 LPA
Work: Contractual
Location: Bangalore
skills
  • restful apis
  • swift
  • ios development
  • mvc
Job Description
Position: iOS Developer - Technical Validation
Experience: 10+ years
Location: Bangalore
Budget: Up to 36 LPA
Mode: [Specify Onsite / Hybrid / Remote]

Job Overview: We are looking for a highly experienced iOS Developer with over a decade of hands-on expertise in designing, developing, and validating complex iOS applications. The ideal candidate should have strong proficiency in Swift, SwiftUI, and modern iOS frameworks, coupled with a deep understanding of architecture patterns, API integration, and app lifecycle management.

Key Responsibilities:
- Lead the technical validation and architecture review of iOS applications.
- Design, develop, and optimize advanced applications for the iOS platform.
- Review and guide the team on best practices in Swift, SwiftUI, and Combine.
- Ensure seamless integration with RESTful APIs and data storage frameworks.
- Drive UI/UX consistency across multiple devices (iPhone, iPad).
- Oversee code reviews, testing strategies, and app deployment pipelines.
- Implement security best practices for data protection and API key management.
- Mentor and guide junior developers in code quality and app performance optimization.

Technical Validation Focus Areas:

1. Core iOS Development
- Explain the iOS app lifecycle clearly and identify key transitions.
- Distinguish between ViewController, View, and AppDelegate responsibilities.
- Manage background tasks and long-running operations efficiently.

2. Programming Language Expertise
- Deep understanding of Swift (Objective-C experience is a plus).
- Differentiate between structs and classes in Swift.
- Strong grasp of optionals and safe unwrapping techniques.

3. Architecture & Frameworks
- Hands-on experience with MVC, MVVM, or VIPER architectures.
- Exposure to SwiftUI and Combine, and how they differ from UIKit.
- Implementation experience with dependency injection and design patterns (Singleton, Observer).

4. Networking & Data Handling
- Expertise in API communication using URLSession, Alamofire, or similar.
- Data persistence experience using Core Data, Realm, or UserDefaults.

5. UI & Modern Development
- Proven experience building UI with SwiftUI.
- Knowledge of Auto Layout, adaptive UI, and device responsiveness.

6. Testing, Deployment & Security
- Experience with XCTest and Quick/Nimble for unit and UI testing.
- Strong understanding of App Store deployment processes and CI/CD tools.
- Security best practices for sensitive data protection and keychain usage.

Required Skills:
- Swift, SwiftUI, Combine, UIKit, Objective-C (optional)
- RESTful APIs, JSON parsing, Core Data, Realm
- MVC / MVVM / VIPER architecture
- Xcode, Git, Jenkins/Fastlane (preferred)
- App lifecycle and background processing
- Unit testing, UI testing, security and keychain management

Educational Qualification: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Why Join Us:
- Opportunity to work on cutting-edge iOS technologies.
- Strong focus on architecture, scalability, and app performance.
- Competitive compensation up to 36 LPA.
- Dynamic work environment with a focus on innovation and technical excellence.
posted 2 days ago

Quality Assurance Engineer

CONCEPTS GLOBAL Hiring For AI/ML company
Experience: 4 to 9 Yrs
Salary: 10 - 16 LPA
Work: Remote
Location: Bangalore, Noida, Chennai, Hyderabad, Kolkata, Gurugram, Pune, Mumbai City, Delhi

skills
  • validation
  • artificial intelligence
  • testing
  • quality assurance engineering
  • software quality assurance
  • machine learning
  • test automation
  • quality assurance
Job Description
Position: Quality Assurance Engineer (AI/ML)
Work Mode: Remote
Employment Type: Permanent
Salary: Best in the industry

Job Responsibilities:
- Develop and execute test plans for AI/ML models and data pipelines
- Build and maintain Playwright automation (TypeScript/JavaScript)
- Validate AI model accuracy and monitor model drift (see the sketch below)
- Perform data quality checks and support microservices testing
- Integrate automation into CI/CD pipelines
- Troubleshoot issues and contribute to process improvement
- Prepare test documentation and quality reports

Required Skills & Experience:
- 4+ years of QA experience, with 2+ years in automation
- Strong hands-on experience with Playwright
- Experience in testing AI/ML systems and ML evaluation metrics
- Knowledge of Python, JavaScript/TypeScript, and API testing
- Familiarity with microservices, Docker, Kubernetes, Git, CI/CD
- Good communication, analytical skills, and Agile experience
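Two of the checks above, an accuracy gate and a crude drift alarm, can be sketched in pure Python. The thresholds and sample numbers are invented; a real pipeline would use proper statistical tests over live score distributions.

def accuracy(y_true: list, y_pred: list) -> float:
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def mean_shift(baseline: list, current: list) -> float:
    """Naive drift signal: relative shift of the mean versus the baseline."""
    base = sum(baseline) / len(baseline)
    curr = sum(current) / len(current)
    return abs(curr - base) / abs(base)

# Accuracy gate on a labeled evaluation batch.
y_true, y_pred = [1, 0, 1, 1, 0], [1, 0, 1, 0, 0]
assert accuracy(y_true, y_pred) >= 0.75, "accuracy gate failed"

# Drift alarm on model score distributions.
baseline_scores = [0.62, 0.58, 0.65, 0.60]
todays_scores = [0.81, 0.79, 0.85, 0.80]
if mean_shift(baseline_scores, todays_scores) > 0.15:
    print("Drift alert: score distribution moved; trigger a model review")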
posted 6 days ago

Databricks Developer

Vy Systems Private Limited
Experience: 4 to 8 Yrs
Salary: 50,000 - 3.5 LPA
Work: Remote
Location: Bangalore, Chennai, Noida, Hyderabad, Gurugram, Kolkata, Pune, Mumbai City, Delhi

skills
  • sql
  • aws
  • databricks
Job Description
Job Title: Databricks Developer
Experience: 4-8 years
Location: Remote

Job Summary: We are looking for an experienced Databricks Developer with strong expertise in PySpark, SQL, and Python, and hands-on experience deploying data solutions on AWS (preferred) or Azure. The ideal candidate will design, develop, optimize, and maintain scalable data pipelines and analytics solutions using the Databricks unified data platform.

Key Responsibilities:
- Design, build, and optimize ETL/ELT pipelines using Databricks and PySpark.
- Develop scalable and high-performance data processing workflows on AWS (EC2, S3, Glue, Lambda, IAM) or Azure (ADF, ADLS, Synapse).
- Implement and manage Delta Lake for ACID transactions, schema evolution, and time-travel data needs (see the sketch below).
- Write efficient and complex SQL queries for data transformation, validation, and analytics.
- Build and maintain data ingestion frameworks, including streaming (Kafka, Kinesis, Event Hub) and batch processing.
- Optimize Databricks clusters, jobs, and workflows, and performance-tune PySpark code.
- Collaborate with data engineers, analysts, and product teams to define data requirements and deliver high-quality solutions.
- Ensure data governance, security, compliance, and best practices across cloud environments.
- Troubleshoot production pipelines and support CI/CD deployments for Databricks jobs.

Required Skills & Experience:
- 4-8 years of experience in Data Engineering or Big Data development.
- Strong hands-on experience with Databricks (clusters, jobs, notebooks, workflows).
- Advanced proficiency with PySpark for batch/stream processing.
- Strong programming skills in Python.
- Expertise in SQL (complex transformations, window functions, optimization).
- Hands-on experience working with AWS (preferred) or Azure cloud services.
- Experience with Delta Lake, Parquet, and data lake architectures.
- Familiarity with CI/CD pipelines (GitHub Actions, Azure DevOps, Jenkins, etc.).
- Good understanding of data modeling, performance tuning, and distributed computing.

Apply: sanjai@vyystems.com
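The Delta Lake responsibility above often comes down to an idempotent MERGE upsert. A minimal sketch assuming a Databricks or delta-spark-enabled session; the table and column names are invented, and the target table is assumed to already exist.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-upsert").getOrCreate()

# Hypothetical batch of customer updates.
updates = spark.createDataFrame(
    [("c1", "asha.rao@example.com"), ("c3", "new.user@example.com")],
    ["customer_id", "email"],
)
updates.createOrReplaceTempView("updates")

# MERGE gives an idempotent upsert: re-running the same batch neither
# duplicates rows nor loses data, thanks to Delta's ACID guarantees.
spark.sql("""
    MERGE INTO silver.customers AS t
    USING updates AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET t.email = s.email
    WHEN NOT MATCHED THEN INSERT (customer_id, email)
                         VALUES (s.customer_id, s.email)
""")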