Kafka Engineer
Devsinc is looking for a knowledgeable and enthusiastic Kafka Engineer to join our growing team in KSA. In this role, you will be responsible for designing, implementing, and managing Kafka-based data pipelines and stream processing systems. You will work with cross-functional teams to integrate Kafka with different systems and ensure data flow reliability and efficiency.
Requirements
- Kafka Operations: Confluent Kafka; cluster setup and maintenance.
- Cluster Management: High availability, cross-site replication, ZooKeeper.
- Schema Registry & Topic Governance: Confluent Schema Registry or equivalent.
- Monitoring: Prometheus JMX Exporter, Dynatrace Kafka plugins, Elastic.
- Security: TLS, SASL, ACLs, RBAC.
- Integration & APIs: Kafka REST API, connectors.
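As a rough illustration of the security items above, here is a minimal sketch of a secured Kafka client configuration using librdkafka-style property names (as consumed by the Confluent clients). The broker address, credentials, and CA path are illustrative placeholders, not values from this posting:

```python
# Sketch of a TLS + SASL client configuration using librdkafka-style
# property names. All concrete values below are placeholders.

def secured_client_config(bootstrap: str, username: str, password: str) -> dict:
    """Build a client configuration dict combining TLS transport with SASL auth."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",        # TLS transport + SASL authentication
        "sasl.mechanisms": "SCRAM-SHA-512",     # one common SASL mechanism
        "sasl.username": username,
        "sasl.password": password,
        "ssl.ca.location": "/etc/kafka/ca.pem", # placeholder CA bundle path
    }

config = secured_client_config("broker-1:9093", "pipeline-svc", "secret")
```

A dict like this would be passed directly to a producer or consumer constructor; ACL and RBAC policies are then enforced broker-side against the authenticated principal.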
Key Responsibilities:
- Apply event-driven architecture and distributed-system principles.
- Tune performance and optimize throughput.
- Design disaster recovery strategies for Kafka clusters.
- Manage governance, schema evolution, and topic lifecycle.
- Operate and scale Kafka clusters reliably.
- Ensure high availability and cross-site replication.
- Implement governance, topic management, and secure access.
- Integrate Kafka metrics and logs into observability dashboards.
- Enforce ACL and RBAC policies for producers and consumers.
- Respond to Kafka-related incidents and outages.
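Much of the throughput tuning and topic management above hinges on how record keys map to partitions. A simplified, stdlib-only sketch of key-based partition assignment (Kafka's default partitioner actually uses murmur2 hashing; md5 stands in here purely for illustration):

```python
import hashlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Map a record key to a partition deterministically.

    Kafka's default partitioner uses murmur2; md5 is an illustrative
    stand-in to keep this sketch dependency-free.
    """
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# Records sharing a key always land in the same partition,
# which is what preserves per-key ordering in Kafka.
p1 = assign_partition(b"order-42", 12)
p2 = assign_partition(b"order-42", 12)
```

Because ordering is only guaranteed within a partition, choices about keys and partition counts directly constrain both parallelism and consumer-side ordering.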
Experience: a minimum of 4–6 years.