Job Description – Senior Kafka Engineer
Position: Senior Kafka Engineer
Experience: 7+ years
Location: Bangalore / Pune
Open Positions: 1
About the Role
We are seeking a highly skilled Senior Kafka Engineer with deep expertise in Apache Kafka to design, implement, and support high-performance, scalable, and reliable messaging and streaming solutions. The ideal candidate will have extensive experience owning end-to-end Kafka solutions, troubleshooting complex issues, and collaborating with cross-functional teams to deliver seamless data streaming architectures.
Key Responsibilities
Design & Architecture:
Architect, implement, and optimize large-scale Kafka solutions.
Define and implement best practices for Kafka cluster design, monitoring, and security.
Work on high-throughput and low-latency streaming systems.
Administration & Operations:
Install, configure, and maintain Kafka clusters in production and non-production environments.
Manage Kafka brokers, ZooKeeper (or KRaft mode), Schema Registry, and connectors.
Perform capacity planning, performance tuning, and monitoring of clusters.
Troubleshooting & Support:
Troubleshoot and resolve production issues related to Kafka.
Implement proactive monitoring, alerting, and recovery strategies.
Provide 24x7 support for business-critical streaming workloads (as required).
Collaboration & Development:
Partner with application developers, DevOps, and data engineering teams to integrate Kafka into enterprise solutions.
Develop and manage Kafka topics, producers, consumers, and streaming pipelines.
Create technical documentation and provide knowledge transfer to teams.
Required Skills & Experience
Core Expertise:
7+ years of hands-on experience with Apache Kafka in production environments.
Strong expertise in Kafka design, cluster administration, security, monitoring, and troubleshooting.
Proven ability to design scalable, highly available, and fault-tolerant Kafka architectures.
Technical Skills:
Strong knowledge of Kafka Connect, Kafka Streams, Schema Registry, and REST Proxy.
Hands-on experience with ZooKeeper / KRaft mode and cluster migrations.
Familiarity with DevOps tools for deployment and monitoring (Prometheus, Grafana, Confluent Control Center, etc.).
Proficiency in Java / Scala / Python for developing Kafka producers and consumers.
Knowledge of cloud platforms (AWS / Microsoft Azure / GCP) and container orchestration (Kubernetes, Docker) is a plus.
Soft Skills:
Excellent problem-solving and troubleshooting skills.
Strong communication and stakeholder management abilities.
Ability to work independently and collaboratively in fast-paced environments.
Good to Have
Experience with the Confluent Platform.
Knowledge of real-time data pipelines, event-driven architecture, and microservices integration.
Exposure to NoSQL databases and big data ecosystems (Hadoop, Spark, Flink).