
Kafka Engineer Jobs

  • 3 - 5 yrs
  • 20.0 Lac/Yr
  • Mohali Sector 66
SQL Akka Scala Programming Scalability Concurrency Distributed Systems Functional Programming Kafka Git Microservices Play Framework Linux Spark Sbt RESTful APIs AWS Testing Java Mongo DB NoSQL
Key Responsibilities:
- Design, develop, and maintain applications using Scala and related frameworks such as Akka, including Akka HTTP, Akka Actor, and Akka Streams.
- Build and integrate RESTful APIs to support scalable, high-performance applications.
- Work with messaging queues (Kafka) for event-driven and distributed system communication.
- Develop and manage data solutions using SQL and NoSQL databases, ensuring efficient data storage and retrieval; MongoDB is a must.
- Implement and manage Elasticsearch for search functionality, indexing, and performance optimization.
- Apply software design patterns and development principles to ensure clean, scalable, and maintainable code.
- Collaborate with cross-functional teams in an Agile environment, participating in sprint planning, development, and review meetings.
- Use Git for version control and follow best practices for code quality and documentation.
- Analyze, debug, and troubleshoot technical issues to deliver reliable and efficient solutions.
Requirements:
- 3-5 years of hands-on experience in Scala development.
- Familiarity with the Akka framework and REST API development.
- Exposure to SQL and NoSQL databases.
- Understanding of software design principles and patterns.
- Knowledge of Git or other version-control tools.
- Strong problem-solving and analytical skills.
- Ability to work collaboratively in a team-oriented environment.
  • 5 - 11 yrs
  • 25.0 Lac/Yr
  • Bangalore
Apache Kafka Azure Grafana Data Warehousing
Role: Data Engineer 2.0
Location: Remote
Experience: Min. 5 years
Notice Period: Immediate to 15 days, or currently serving notice
Key Responsibilities:
- Design and implement manual test strategies for real-time streaming use cases using Azure Service Bus, Event Hubs, Kafka, and Azure Functions.
- Validate Spark Streaming applications, including unbounded data flows, streaming DataFrames, checkpoints, and streaming joins.
- Develop test plans for containerized microservices deployed on Kubernetes, ensuring scalability and fault tolerance.
- Test data ingestion and transformation workflows across open table formats such as Delta Lake, Apache Iceberg, and Hudi.
Good to Have:
- Monitoring and troubleshooting system performance using observability stacks such as Prometheus, Grafana, and ELK.
- Functional and performance testing on analytical databases and query engines such as Trino, StarRocks, and ClickHouse.
- Testing and validation of data products designed under a data mesh architecture, ensuring domain-oriented data quality and governance.

Looking For Data Engineer

BSRI Solutions Pvt Ltd

  • 3 - 5 yrs
  • 16.0 Lac/Yr
  • Chennai
Python Pyspark Developer Scala SQL Hive Hadoop Google Cloud Platform Kafka Developer Infrastructure AS Code GitHub Agile Methodology ETL
Required Qualifications:
- 3+ years of demonstrated ability with Hive, Python, Spark/Scala, SQL, etc.
- Google Cloud Platform experience: BigQuery, Cloud Storage, Dataproc, Dataflow, Cloud Composer, Cloud SQL, Pub/Sub, Terraform, etc.
- Experience with the Hadoop ecosystem, Kafka, and PCF cloud services.
- Familiarity with big data and machine learning tools and platforms.
- Experience with BI tools such as Alteryx, DataStage, QlikSense, etc.
- Design data pipelines and data robots; take a vision and bring it to life.
- Master data engineer who mentors others and works closely with IT architects to set strategy and design projects.
- Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts.
- Redesign data flows to prevent recurring data issues.
- Strong analytical and problem-solving skills.
- Excellent oral and written communication, facilitation, and presentation skills, with an engaging presentation style.
- Ability to work as a global team member, as well as independently, in a changing environment and to prioritize.
- Ability to establish and maintain coordinated and effective working relationships with application implementation teams, IT project teams, business customers, and end users.
- Ability to deliver work within deadlines.
- Experience with agile/lean methodologies.
- Experience working independently and with minimal supervision.
- Experience with Test-Driven Development and software craftsmanship.
- Experience with GitHub, AccuRev, or other version-control systems.
- Experience with PuTTY and DataStage.
- Strong communication skills; ability to illustrate and convey ideas and prototypes effectively with the team and partners.
- Presence demonstrating confidence, the ability to learn quickly, and the ability to influence and shape ideas.
Key Skills Required:
- Data Engineer: Python / PySpark / Scala
- SQL & Hive
- Hadoop ecosystem
- Data pipeline design & ETL development
- Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage)
- Kafka / streaming data processing
- Terraform (Infrastructure as Code)
- DataStage or similar ETL tools
- Version control (GitHub or equivalent)
- Agile methodologies
- Strong analytical & problem-solving skills
- Stakeholder collaboration & communication
Nice to Have:
- Cloud Composer, Cloud SQL, Pub/Sub
- BI tools (Alteryx, QlikSense)
- Machine learning platform exposure
- Test-Driven Development (TDD)
- Mentoring & technical leadership

Looking For Data Architect

Toolify Private Limited

  • 9 - 15 yrs
  • 40.0 Lac/Yr
  • Jaipur
Data Architect Databricks Developer Apache Spark Delta Lake Azure Synapse Azure Data AWS Redshift AWS Glue SQL Pyspark Developer Kafka Engineer Big Data
Job Summary:
We are seeking a skilled Data Architect to lead the design and implementation of high-performance, scalable data platforms. This role involves architecting modern data lakes, warehouses, and streaming systems using Databricks and cloud technologies. If you enjoy solving complex data challenges and driving data-driven decision-making, this role is for you.
Key Responsibilities:
- Design and implement scalable data lakes, data warehouses, and real-time streaming architectures.
- Build, optimize, and manage Databricks solutions using Spark, Delta Lake, Workflows, and SQL Analytics.
- Develop cloud-native data platforms on Azure (Synapse, Data Factory, Data Lake) and AWS (Redshift, Glue, S3).
- Create and automate ETL/ELT pipelines using Apache Spark, PySpark, and cloud tools.
- Design and maintain data models (dimensional, normalized, star schemas) to support analytics and reporting.
- Leverage big data technologies such as Hadoop, Kafka, and Scala for large-scale data processing.
- Ensure data governance, security, and compliance with standards such as GDPR and HIPAA.
- Optimize Spark workloads and storage for performance and cost efficiency.
- Collaborate with engineering, analytics, and business teams to align data solutions with organizational goals.
Required Skills & Qualifications:
- 8+ years of experience in data architecture, data engineering, or analytics.
- Strong hands-on experience with Databricks (Delta Lake, Spark, MLflow, Pipelines).
- Expertise in Azure (Synapse, Data Factory, Data Lake) and AWS (Redshift, S3, Glue).
- Proficiency in SQL and Python or Scala.
- Experience with NoSQL databases (e.g., MongoDB) and streaming platforms (e.g., Kafka).
- Solid understanding of data governance, security, and compliance best practices.
- Excellent problem-solving, communication, and cross-functional collaboration skills.
Looking forward to receiving suitable profiles at the earliest.
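The dimensional (star-schema) modelling this role asks for can be illustrated with a toy example: a fact table of transactions joined to dimension tables and rolled up along a dimension attribute. This is a minimal sketch in plain Python; the table and column names are hypothetical.

```python
# Star-schema sketch: one fact table, two dimension tables.
# All names here are illustrative, not from any real system.
dim_product = {1: {"name": "widget", "category": "hardware"}}
dim_region = {10: {"region": "APAC"}}
fact_sales = [
    {"product_id": 1, "region_id": 10, "amount": 250.0},
    {"product_id": 1, "region_id": 10, "amount": 100.0},
]

def sales_by_category(facts, products):
    """Roll fact rows up along a dimension attribute (product category)."""
    totals = {}
    for row in facts:
        category = products[row["product_id"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))  # {'hardware': 350.0}
```

In a warehouse the same roll-up would be a fact-to-dimension join with GROUP BY; the point of the star layout is that facts stay narrow while descriptive attributes live in the dimensions.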


Python Developer - Full Time

DifferentByte Technologies

  • 1 - 3 yrs
  • Kochi
Python FastAPI Django AWS Kafka Airflow Flask
Skills & Requirements:
- Sound knowledge of Python (FastAPI)
- AWS
- PostgreSQL / Supabase
- Queuing systems (Kafka, Airflow)
- Knowledge of GenAI, ML, DL, TensorFlow, PyTorch
Key Responsibilities:
- Write clean, scalable, and efficient code.
- Develop back-end components to improve responsiveness and performance.
- Integrate user-facing elements into applications.
- Test and debug programs.
- Enhance the functionality of existing systems.
- Implement robust security and data protection.
- Assess and prioritize new feature requests.
- Collaborate with internal teams to understand requirements and deliver effective solutions.
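The core idea behind the queuing systems named above (Kafka, Airflow) is decoupling producers from consumers. A toy stand-in using only the Python stdlib, with a sentinel value for shutdown:

```python
# Minimal producer/consumer sketch with the stdlib queue module.
# This is a single-process illustration only; Kafka and Airflow solve the
# same decoupling problem across machines, with durability and replay.
import queue
import threading

q: "queue.Queue[int]" = queue.Queue()
results = []

def worker():
    while True:
        item = q.get()
        if item is None:          # sentinel: shut the worker down
            q.task_done()
            break
        results.append(item * 2)  # "process" the message
        q.task_done()

t = threading.Thread(target=worker)
t.start()
for i in range(3):                # producer side
    q.put(i)
q.put(None)
q.join()
t.join()
print(results)  # [0, 2, 4]
```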
  • 5 yrs
  • 25.0 Lac/Yr
  • Hyderabad
Java Spring Boot Python Microservices REST GraphQL GCP GKE Kubernetes Kafka Postgres CICD GitHub Actions Argo CD MLOps Vertex AI Kubeflow NLP GenAI
Java Developer - Offshore - Remote (it would be a plus if the candidate is really strong in building MCP)
Responsibilities:
- Develop and maintain backend microservices using Python, Java, and Spring Boot.
- Build and integrate APIs (both GraphQL and REST) for scalable service communication.
- Deploy and manage services on Google Cloud Platform (GKE).
- Work with Google Cloud Spanner (Postgres dialect) and pub/sub tools like Confluent Kafka (or similar).
- Automate CI/CD pipelines using GitHub Actions and Argo CD.
- Design and implement AI-driven microservices.
- Collaborate with data scientists and MLOps teams to integrate ML models.
- Implement NLP pipelines.
- Enable continuous learning and model retraining workflows using Vertex AI or Kubeflow on GCP.
- Enable observability and reliability of AI decisions by logging model predictions, confidence scores, and fallbacks into data lakes or monitoring tools.
Required Qualifications:
- 5+ years of backend development experience with Java and Spring Boot.
- 2+ years working with APIs (GraphQL and REST) in microservices architectures.
- 2+ years of experience integrating or consuming ML/AI models in production environments (e.g., RESTful ML APIs, TensorFlow Serving, or Vertex AI Endpoints; e.g., Rx Claim metadata, clinical documents, NLP processing).
- Familiarity with the ML model lifecycle, from data ingestion, training, and deployment to real-time inference (MLOps).
- 2+ years of hands-on experience with GCP, AWS, or Azure.
- 2+ years working with pub/sub tools like Kafka or similar.
- 2+ years of experience with databases (Postgres or similar).
- 2+ years of experience with CI/CD tools (GitHub Actions, Jenkins, Argo CD, or similar).
Preferred Qualifications:
- Hands-on experience with Google Cloud Platform.
- Familiarity with Kubernetes concepts; experience deploying services on GKE is a plus.
- Strong understanding of microservice best practices and distributed systems.
- Familiarity with Vertex AI, Kubeflow, or similar AI platforms on GCP for model training and serving.
- Understanding of GenAI use cases, LLM prompt engineering, and agentic orchestration (e.g., LangChain, transformers).
- Experience deploying Python-based ML services into Java microservice ecosystems (via REST, gRPC, or sidecar patterns).
- Knowledge of claim adjudication, Rx domain logic, or healthcare-specific workflow automation.
Education: Bachelor's degree or equivalent experience (high school diploma and 4 years of relevant experience).

Java Backend Developer (8-10 Years)

Infogine Solutions Pvt Ltd

  • 8 - 10 yrs
  • 10.0 Lac/Yr
  • Pune
Spring Boot Kafka Developer Hibernate Java8 Core Java Springboot Microservices Kafka Spring Boot Developer Spring Hibernate
Must-have skills: Java 8, Core Java, Spring Boot, Microservices, Hibernate, Kafka

Full Stack Developer

KayJay Global Solutions Pvt.Ltd

  • 1 - 5 yrs
  • Udaipur
Mongo DB My SQL PERN Postgre MERN Express Js Github React Js Nodejs Typescript WFO Kafka Developer Kafka Engineer
About the Role:
We are hiring an experienced MERN Stack Developer who can independently build and scale full-stack applications from scratch. The ideal candidate must have strong skills in TypeScript, React.js, Node.js, Redux, and REST APIs, along with a solid understanding of microservices architecture, mobile API integrations, and responsive design. Experience with LLMs, RAG architecture, Kafka, Redis, webhooks, and database systems like PostgreSQL, MySQL, and MongoDB is essential. You'll be working on scalable systems, enterprise-grade platforms, and AI-integrated products.
Key Responsibilities:
- Develop and maintain full-stack web applications using React.js (with TypeScript) and Node.js (Express/NestJS).
- Build and document robust RESTful APIs and mobile-ready endpoints.
- Implement and consume webhooks for third-party integrations and internal event-based workflows.
- Ensure responsive design principles across devices and screen sizes.
- Integrate with mobile applications via secure and scalable APIs.
- Work with databases including PostgreSQL, MySQL, and MongoDB.
- Integrate Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) pipelines.
- Build and deploy microservices architecture using Kafka, Redis, Docker, etc.
- Manage application state using Redux or Redux Toolkit.
- Write test cases using Mocha, Chai, Jest, or similar frameworks.
- Participate in system design, architecture discussions, and scalability planning.
- Follow Agile practices, perform code reviews, and contribute to CI/CD pipelines.
- Collaborate with cross-functional teams including UI/UX, DevOps, QA, and Product Management.
Required Skills and Experience:
- Minimum 3 years of hands-on experience with the MERN stack.
- Strong proficiency in TypeScript, React.js, Node.js, and Redux.
- Proven experience in building and consuming REST APIs and webhooks.
- Deep knowledge of responsive UI/UX development.
- Hands-on experience with mobile-first API design and integrations.
- Strong knowledge of PostgreSQL, MySQL, and MongoDB.
- Experience with Kafka, Redis, Docker, and microservices.
- Exposure to LLM, GPT, RAG, or similar AI models.
- Solid understanding of software architecture and the ability to build systems from the ground up.
- Working knowledge of API security, JWT, OAuth, and rate limiting.
- Familiarity with test-driven development and QA practices.
- Familiarity with Git, version control, and collaborative workflows.
Good to Have:
- Experience with GraphQL and gRPC.
- Experience in building SaaS or AI-integrated platforms.
- Familiarity with Elasticsearch or RabbitMQ.
- Experience with AWS, GCP, or Azure.
- Understanding of serverless and event-driven architectures.
Benefits:
- Competitive compensation with performance-based hikes.
- Flexible remote or hybrid work options.
- Opportunity to work on cutting-edge technologies including AI and scalable SaaS platforms.
- Strong leadership and learning-driven environment.
- Learning budget for courses, certifications, and conferences.

Senior Kafka Engineer (full Time)

Hirebee Resourcing Pvt Ltd

  • 6 - 10 yrs
  • 1.8 Lac/Yr
  • Bangalore +1 Pune
SQL Azure Java Python AWS GCP Developer
Job Description: Senior Kafka Engineer
Position: Senior Kafka Engineer
Experience: 7+ years
Location: Bangalore / Pune
Open Positions: 1
About the Role:
We are seeking a highly skilled Senior Kafka Engineer with deep expertise in Apache Kafka to design, implement, and support high-performance, scalable, and reliable messaging/streaming solutions. The ideal candidate will have extensive experience owning end-to-end Kafka solutions, troubleshooting complex issues, and collaborating with cross-functional teams to ensure seamless data streaming architectures.
Key Responsibilities:
Design & Architecture:
- Architect, implement, and optimize large-scale Kafka solutions.
- Define and implement best practices for Kafka cluster design, monitoring, and security.
- Work on high-throughput, low-latency streaming systems.
Administration & Operations:
- Install, configure, and maintain Kafka clusters in production and non-production environments.
- Manage Kafka brokers, ZooKeeper (or KRaft mode), schema registry, and connectors.
- Perform capacity planning, performance tuning, and monitoring of clusters.
Troubleshooting & Support:
- Troubleshoot and resolve production issues related to Kafka.
- Implement proactive monitoring, alerting, and recovery strategies.
- Provide 24x7 support for business-critical streaming workloads (as required).
Collaboration & Development:
- Partner with application developers, DevOps, and data engineering teams to integrate Kafka into enterprise solutions.
- Develop and manage Kafka topics, producers, consumers, and streaming pipelines.
- Create technical documentation and provide knowledge transfer to teams.
Required Skills & Experience:
Core Expertise:
- 7+ years of hands-on experience with Apache Kafka in production environments.
- Strong expertise in Kafka design, cluster administration, security, monitoring, and troubleshooting.
- Proven ability to design scalable, high-availability, and fault-tolerant Kafka architectures.
Technical Skills:
- Strong knowledge of Kafka Connect, Kafka Streams, Schema Registry, and REST Proxy.
- Hands-on experience with ZooKeeper / KRaft mode and cluster migrations.
- Familiarity with DevOps tools for deployment and monitoring (Prometheus, Grafana, Confluent Control Center, etc.).
- Proficiency in Java, Scala, or Python for developing Kafka producers and consumers.
- Knowledge of cloud platforms (AWS / Azure / GCP) and container orchestration (Kubernetes, Docker) is a plus.
Soft Skills:
- Excellent problem-solving and troubleshooting skills.
- Strong communication and stakeholder-management abilities.
- Ability to work independently and collaboratively in fast-paced environments.
Good to Have:
- Experience with the Confluent Kafka Platform.
- Knowledge of real-time data pipelines, event-driven architecture, and microservices integration.
- Exposure to NoSQL databases and big data ecosystems (Hadoop, Spark, Flink).
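A core concept behind the producer and topic design work described above is keyed partitioning: records with the same key always hash to the same partition, which is what preserves per-key ordering. A simplified sketch (Kafka's default partitioner actually uses murmur2; the MD5-based hash here is just a deterministic stand-in for illustration):

```python
# Simplified Kafka-style keyed partitioning. Same key -> same partition,
# so all events for one entity are consumed in order by one consumer.
# NOT Kafka's real algorithm (that is murmur2 on the key bytes).
import hashlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a record key deterministically onto a partition number."""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2  # repeated sends of the same key stay on one partition
```

This is also why repartitioning a live topic is disruptive: changing `num_partitions` changes where existing keys land, one of the cluster-migration concerns the role mentions.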

Opening For Golang Developer

Smartcore Solutions

  • 6 - 10 yrs
  • 25.0 Lac/Yr
  • Electronic City Bangalore
JIT Jira Reactjs API Golang Postgre SQL Python Mongodb Kafka Developer
JD: Golang Developer (ReactJS + Go)
Job Title: Software Engineer
Location: Bangalore (Work from Office), Full-Time
Experience: 4-5 years
Employer: (US client project)
About the Role:
Join a fast-paced engineering team working on a global-scale product for a US-based client. You'll be responsible for building and maintaining scalable web applications using modern front-end and back-end technologies. This role demands strong expertise in ReactJS, Go, and PostgreSQL, along with a strong emphasis on writing clean, high-performance code.
What You'll Work On:
- Building responsive front-end interfaces with ReactJS
- Developing scalable APIs and microservices using Go
- Managing efficient builds and automation with Bazel
- Designing and optimizing relational data models in PostgreSQL
- Collaborating through Git, Jira, and Slack
Must-Have Skills:
- 4+ years in full-stack development
- 3+ years with Go (Golang)
- Strong ReactJS experience
- SQL database proficiency (PostgreSQL preferred)
Good-to-Have Skills:
- Redux and modern JavaScript (ES6+)
- Gin web framework
- GORM ORM library
- Docker, Bazel
- Git, Jira, Jenkins CI/CD pipelines
Bonus Skills: Python, Django, MongoDB, Slack, Kafka, Protobufs, Prometheus, Vite
  • 8 - 10 yrs
  • Pune
Kafka Scala Spark Hadoop Airflow Data Lakes Kappa Kappa ++ Architectures RDBMS NoSQL Cassandra Redis Oracle
Sr. Big Data Engineer
Location: Pune
Experience: 10+ years
Mode: Hybrid
Role Overview:
We are seeking a talented Sr. Big Data Engineer to design, develop, and support a highly scalable, distributed SaaS-based security risk prioritization product. You will lead the design and evolution of our data platform and pipelines, providing technical leadership to a team of engineers and architects.
Key Responsibilities:
- Provide technical leadership on data platform design, roadmaps, and architecture.
- Design and implement scalable architecture for Big Data and microservices environments.
- Drive technology explorations, leveraging knowledge of internal and industry prior art.
- Ensure quality architecture and design of systems, focusing on performance, scalability, and security.
- Mentor and provide technical guidance to other engineers.
Required Skills & Technologies:
- Mandatory: Kafka, Scala, Spark.
- Big data & data streaming: Spark, Kafka, Hadoop, Presto, Airflow, data lakes; lambda, kappa, and kappa++ architectures with Flink data streaming.
- Databases & caching: RDBMS, NoSQL, Oracle, Cassandra, Redis.
- Search solutions: Solr, Elastic.
- ML & automation: experience with ML model engineering and related deployment, scripting, and automation.
- Architecture: in-depth experience with messaging queues and caching components.
- Other skills: strong troubleshooting and performance-benchmarking skills for Big Data technologies.
Qualifications:
- Bachelor's degree in Computer Science or equivalent.
- 8+ years of total experience, with 6+ years relevant.
- 2+ years designing Big Data solutions with Spark.
- 3+ years with Kafka and performance testing for large infrastructure.
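A building block shared by the Spark/Flink-style streaming pipelines this role describes is the tumbling window: events are bucketed into fixed, non-overlapping time windows and aggregated per key. A toy sketch in plain Python (the event shape and window size are hypothetical):

```python
# Tumbling-window aggregation, the core idea behind windowed stream
# processing. Real engines (Spark Structured Streaming, Flink) add
# watermarks, late-data handling, and distributed state on top of this.
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Bucket (timestamp, key) events into fixed windows; count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(0, "login"), (3, "login"), (12, "logout")]
print(tumbling_window_counts(events, 10))
# {0: {'login': 2}, 10: {'logout': 1}}
```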

Looking For AWS Engineer

Kudos Technolabs

Spring Boot Developer Microservices Kafka Lambda DynamoDB SQS SNS S3 ECS EC2 AWS AWS Engineer
Job Location: Gurugram, Bangalore, Hyderabad
Experience Level: 6+ years
Job Description:
We are looking for a candidate with 5 years of hands-on experience with AWS services (Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2), mandatory in each service, who has created Lambda functions and done scripting, and who has experience with Java, Spring Boot, Microservices, and Kafka.
Mandatory Skills: Java, Spring Boot, Microservices, Kafka, Lambda, DynamoDB, SQS, SNS, S3, ECS, EC2, AWS services

Urgent Requirement For Java Developer

Prasoon Technologies Pvt Ltd

  • 3 - 6 yrs
  • 14.0 Lac/Yr
  • Goa
Java J2EE Microservices Restful Services Messaging Tools Like Kafka Design Patterns OOPs Concept Multithreading
Job 2: Java Developer
Qualification: Bachelor's degree in IT, Computer Science, BCA, or MCA (mandatory)
Experience: 3-6 years of relevant experience in the skills below
Skills: Java, J2EE, Microservices, RESTful services, messaging tools like Kafka, design patterns, OOP concepts, multithreading
Budget: 14 LPA (negotiable for exceptional candidates)
Location: Goa for the first 3-6 months, followed by a remote work setup. For candidates not based in Goa, remote work is possible after the initial 3-6 months.
  • 5 - 11 yrs
  • Bangalore
Core Java Core Java Developer AWS Kafka Microservices Multithreading Spring Boot Java 8 Advance Java
Greetings from Netsach - A Cyber Security Company.
Job Title: Java Developer
Experience: 4 to 6+ years overall
Location: Bangalore, onsite; full time. A face-to-face interview is required at our Koramangala office.
Job Description:
We are seeking a highly skilled and motivated Senior Java Developer to join our dynamic team. The ideal candidate will have a strong background in Core Java and Spring Boot, with a passion for developing high-quality software solutions. This role requires a keen learner who is punctual, has good communication, and is an excellent team player.
Mandatory skill set: Core Java, Spring Boot, Kafka, AWS, multithreading, microservices, and advanced Java 8.
JD:
- Worked on AWS, Kafka, microservices, multithreading, Core Java, and advanced Java 8.
- Design, develop, and maintain high-performance Java applications.
- Implement and manage Spring Boot-based microservices.
- Good basics in Git.
- Agile methodologies and tools like ADO/JIRA.
- Experience with Java multithreading.
- Familiarity with JPA or Hibernate.
- Exposure to cloud technologies such as AWS, Azure, or GCP.
- Any SQL.
- Should be hands-on in all of the above technologies.
Interested candidates who are suitable as per the job description may write to emily@netsach.co.in and apply at netsachglobal.com.
Thank you,
Emily Jha
emily@netsach.co.in
Netsach - A Cyber Security Company
netsachglobal.com
Core Java Kafka AWS Java Java 8
Job Description:
- Must have 2+ years of experience in Kafka and AWS.
- Worked on AWS, Kafka, microservices, multithreading, Core Java, and advanced Java 8.
- Design, develop, and maintain high-performance Java applications.
- Implement and manage Spring Boot-based microservices.
- Good basics in Git.
- Agile methodologies and tools like ADO/JIRA.
- Experience with Java multithreading.
- Familiarity with JPA or Hibernate.
- Exposure to cloud technologies such as AWS, Azure, or GCP.
- Any SQL.
  • 5 yrs
  • 1.5 Lac/Yr
  • Mohali
React JS Kafka Java Script Developer Python Postgre SQL MongoDB NodeJS Springboot Express.js CSS Cascading Style Sheets Azure Gitflow AWS Kubernetes Shell Scripting FullStack
We are looking for a Full Stack Developer with expertise in React, Node.js, PostgreSQL, and AWS to enhance our TMS platform. The ideal candidate should have experience in logistics software, API integrations, and scalable architectures. Candidates with team-handling experience are preferred.
Key Responsibilities:
1. Front-End Development
- Develop a modern, user-friendly interface using React.
- Implement Redux for state management and RTK for making HTTP requests.
- Design clean and efficient UI using Material-UI components.
- Optimize performance using Vite for module bundling and fast builds.
- Integrate the Google Maps API and HERE Maps API for real-time tracking and geolocation services.
2. Back-End Development
- Develop and maintain APIs using Node.js with Express.
- Implement JWT-based authentication for secure user access.
- Build and maintain a RESTful API for front-end and third-party integrations.
- Optimize performance for real-time dispatching, load tracking, and vehicle management.
3. Database Management
- Use PostgreSQL for structured relational data storage.
- Use MongoDB as a NoSQL alternative where needed.
- Ensure database performance, security, and scalability.
4. Cloud Infrastructure & Deployment
- Deploy and manage services on AWS (EC2 for hosting, S3 for storage, RDS for database management).
- Optimize server performance and cloud costs.
- Implement scalable and secure cloud-based solutions.
5. Security & Compliance
- Ensure data security and role-based access control (RBAC).
- Maintain session-timeout mechanisms for inactive users.
- Implement logging and audit trails for user activities.
Required Skills & Qualifications:
- 5+ years of full-stack development experience (preferably in logistics or SaaS).
- Expertise in React, Redux, Material-UI, RTK, and Vite.
- Strong experience in Node.js with Express for backend development.
- Hands-on experience with PostgreSQL and MongoDB.
- Experience integrating the Google Maps API and HERE Maps API.
- Cloud expertise in AWS (EC2, S3, RDS).
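The JWT-based authentication mentioned above rests on one mechanism: the server HMAC-signs a payload, and later verifies that the signature still matches, so a tampered token is rejected. A minimal sketch of that signing idea, shown in Python for brevity (the listing's stack is Node.js), with a hypothetical secret; production code should use a vetted JWT library, not this:

```python
# HMAC signing sketch behind JWT-style tokens. Illustration only: a real
# JWT also carries a header, expiry claims, and base64url-encoded signature.
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key, never hard-code in production

def sign(payload: dict) -> str:
    """Encode the payload and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload, sort_keys=True).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify(token: str) -> bool:
    """Recompute the signature and compare in constant time."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

token = sign({"user": "alice", "role": "dispatcher"})
assert verify(token)                                   # untouched token passes
assert not verify(token[:-1] + ("0" if token[-1] != "0" else "1"))  # tampered fails
```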
  • 7 - 10 yrs
  • 13.0 Lac/Yr
  • Vadodara
Kafka Agile SQL Kubernetes
Job Title: Backend Java Developer (Remote)
Type: Remote (Vadodara)
Work Shift: US time zone
Detailed Job Description:
We are seeking a skilled Backend Java Developer with 8-10 years of experience across the software development life cycle. The ideal candidate will have a strong background in Java backend development and experience with search technologies such as Elasticsearch, as well as Kafka, databases, and Kubernetes. Proficiency in Spring Boot and SQL is required, and experience with Node.js is preferred. Familiarity with Agile methodologies and SAFe Agile practices is also preferred.
Qualifications:
- 8-10 years of experience in software development
- Strong expertise in Java backend development
- Experience with search technologies (Elasticsearch)
- Proficiency in Kafka, databases, and Kubernetes
- Experience with Spring Boot and SQL
- Node.js experience preferred
- Familiarity with Agile methodologies and SAFe Agile practices preferred

Urgent Requirement For Python Developer

Futuretek Ai Solutions Private Ltd

  • 8 - 14 yrs
  • Around Chennai
Python React Js Kafka Pycharm
Job Title: Python Tech Lead / Senior Developer
Office Location: Multiple locations across India
Interview Mode: Video
Salary Range: According to experience
Job Description:
We are seeking a highly skilled and experienced Senior Python Developer with leadership experience to join our team. The ideal candidate will have a strong understanding of Python development. This role primarily focuses on enhancing existing Python codebases by creating new classes that implement established interfaces. Additionally, the candidate should understand complex business logic and be an expert in handling complex scenarios for a given problem statement. The candidate should exhibit a willingness to learn and apply basic Foundry concepts for effective testing. Experience in handling near-real-time data stream processing is a must. Experience with frameworks like Flask or Django, or tools like Kafka, will be considered a big plus.
Key Responsibilities:
- Play the tech lead role and handle a large Python codebase independently.
- Collaborate with the development team to enhance and extend existing Python codebases.
- Create new classes that implement existing interfaces, ensuring code quality and adherence to coding standards.
- Identify and address edge cases and corner scenarios in complex systems, enhancing system reliability and stability.
- Continuously learn and apply basic Foundry principles to support efficient testing and validation of Python code.
- Participate in code reviews, providing constructive feedback to peers and actively seeking feedback for self-improvement.
- Should have exposure to building user-defined functions (UDFs) in Python.
- Provide application support to ensure smooth and efficient functioning of Python applications.
- Offer operations support to maintain and enhance system performance and reliability.
Qualifications:
- A minimum of 10+ years of professional experience in Python development and 6+ years of tech lead experience.
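The core task the listing describes, adding new classes that implement an established interface, is commonly expressed in Python with the `abc` module: the interface declares abstract methods, and new implementations slot in without touching existing callers. The interface and class names below are hypothetical, for illustration only.

```python
# Interface-driven extension: new classes implement an established
# interface, so callers that depend on the interface need no changes.
from abc import ABC, abstractmethod

class RecordProcessor(ABC):
    """Established interface: every processor transforms one record."""

    @abstractmethod
    def process(self, record: dict) -> dict: ...

class UppercaseNameProcessor(RecordProcessor):
    """A new implementation added against the existing interface."""

    def process(self, record: dict) -> dict:
        return {**record, "name": record["name"].upper()}

processor: RecordProcessor = UppercaseNameProcessor()
print(processor.process({"name": "ada"}))  # {'name': 'ADA'}
```

Because `RecordProcessor` is abstract, Python refuses to instantiate it directly, which keeps every concrete subclass honest about providing `process`.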
  • 5 - 9 yrs
  • 10.0 Lac/Yr
  • Hyderabad
Microservices API Cassandra Postgres SQL Java 8 Azure Spring Boot Kafka Jenkins Docker
We are seeking a skilled Senior Java Developer with expertise in Java 8, Azure, Spring Boot, Microservices, API development, Cassandra, PostgreSQL, Kafka, and CI/CD tools such as Jenkins and Docker. The ideal candidate will be responsible for designing, developing, and maintaining complex, high-performance, and scalable software applications.
  • 8 - 11 yrs
  • Mumbai
Hadoop Cloudera Hive Impala Hbase Sqoop Flume Kafka Nifi Hadoop Architecture Hadoop Developer
Hadoop Lead Role
Experience: 9+ years
- Strong architectural experience with Hortonworks and Cloudera Hadoop distributions on appliance-based and on-premise clusters.
- Expertise in providing technical solutions for data lake design and data ingestion in Hadoop.
- Expertise in data modelling, data governance, and architecture of large databases.
- Expertise in understanding complex data models, large-scale data migrations, and application development.
- Expertise in designing data pipeline solutions for structured, semi-structured, and unstructured data in Hadoop.
- Expertise in developing solutions for batch and real-time data processing in Hadoop.
- Hands-on experience in query writing using HiveQL, Impala QL, and HBase commands.
- Able to provide technical guidance and assistance to the development team if they face any environment-related problems.
- Able to coordinate with the Cloudera team for troubleshooting, and with the respective teams for OS/network-related problems.
- Proficiency in Cloudera Manager architecture and the Cloudera cluster environment.
- Proficiency in data ingestion tools like Sqoop, Flume, Kafka, and NiFi (HDF), plus UNIX shell scripting and Python.
- Proficiency in building a data warehouse on top of Hadoop in Hive; defining data models and data mapping from the source to the enterprise model.
- Proficiency in creating solutions in Spark (Scala/PySpark) for batch and real-time processing.
- Proficiency in developing solutions for NoSQL databases like HBase, Cassandra, and MongoDB.
- Knowledge of data security/governance and data lineage handling on Hadoop clusters.
- Strong experience with SQL Server, Oracle, and MySQL.
- Expertise in understanding Java and REST API concepts and troubleshooting Java-based services.