Job Description – Kafka/Integration Architect
Position brief:
The Kafka/Integration Architect is responsible for designing, implementing, and managing Kafka-based streaming data pipelines and messaging solutions. The role involves configuring, deploying, and monitoring Kafka clusters to ensure the high availability and scalability of data streaming services. The architect collaborates with cross-functional teams to integrate Kafka into various applications and ensures optimal performance and reliability of the data infrastructure.
The Kafka/Integration Architect plays a critical role in driving data-driven decision-making and enabling real-time analytics, contributing directly to the company’s agility, operational efficiency, and ability to respond quickly to market changes. This work supports key business initiatives by ensuring that data flows seamlessly across the organization, empowering teams with timely insights and enhancing the customer experience.
Location: Hyderabad
Primary Role & Responsibilities:
- Design, implement, and manage Kafka-based data pipelines and messaging solutions to support critical business operations and enable real-time data processing.
- Configure, deploy, and maintain Kafka clusters, ensuring high availability and scalability to maximize uptime and support business growth.
- Monitor Kafka performance and troubleshoot issues to minimize downtime and ensure uninterrupted data flow, enhancing decision-making and operational efficiency.
- Collaborate with development teams to integrate Kafka into applications and services.
- Develop and maintain Kafka connectors such as the JDBC, MongoDB, and S3 connectors, along with topics and schemas, to streamline data ingestion from databases, NoSQL data stores, and cloud storage, enabling faster data insights (a connector-registration sketch follows this list).
- Implement security measures to protect Kafka clusters and data streams, safeguarding sensitive information and maintaining regulatory compliance (see the ACL sketch after this list).
- Optimize Kafka configurations, such as partition counts, replication factors, and retention policies, for performance, reliability, and scalability (see the topic-configuration sketch after this list).
- Automate Kafka cluster operations using infrastructure-as-code tools like Terraform or Ansible to increase operational efficiency and reduce manual overhead.
- Provide technical support and guidance on Kafka best practices to development and operations teams, enhancing their ability to deliver reliable, high-performance applications.
- Maintain documentation of Kafka environments, configurations, and processes to ensure knowledge transfer, compliance, and smooth team collaboration.
- Stay updated with the latest Kafka features, updates, and industry best practices to continuously improve data infrastructure and stay ahead of industry trends.
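For illustration only, a minimal topic-configuration sketch using the Kafka AdminClient in Java. The topic name, broker address, and the specific retention and replication values are assumptions chosen for the example, not requirements of the role:

```java
import java.util.Collections;
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateOrdersTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Placeholder bootstrap address; point at your own brokers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions and replication factor 3 for availability;
            // retention and min.insync.replicas would be tuned per workload.
            NewTopic orders = new NewTopic("orders", 6, (short) 3)
                .configs(Map.of(
                    TopicConfig.RETENTION_MS_CONFIG, "604800000",        // 7 days
                    TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));
            admin.createTopics(Collections.singleton(orders)).all().get();
        }
    }
}
```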
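A hedged sketch of registering a JDBC source connector through the Kafka Connect REST API, here driven from Java's built-in HTTP client. It assumes Confluent's kafka-connect-jdbc plugin is installed on the Connect worker; the worker address, database URL, credentials, and table name are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterJdbcConnector {
    public static void main(String[] args) throws Exception {
        // Connector config for the Confluent JDBC source plugin; the
        // connection details and table below are example placeholders.
        String config = """
            {
              "name": "orders-jdbc-source",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
                "connection.url": "jdbc:postgresql://db-host:5432/shop",
                "connection.user": "kafka_connect",
                "connection.password": "secret",
                "mode": "incrementing",
                "incrementing.column.name": "id",
                "table.whitelist": "orders",
                "topic.prefix": "pg-"
              }
            }""";

        // POST the config to the Connect worker's REST endpoint.
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("http://localhost:8083/connectors"))
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(config))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```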
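And a minimal ACL sketch: granting a service principal read access to a topic via the AdminClient over an authenticated SASL_SSL connection. The principal, broker address, and credentials are hypothetical:

```java
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.common.acl.AccessControlEntry;
import org.apache.kafka.common.acl.AclBinding;
import org.apache.kafka.common.acl.AclOperation;
import org.apache.kafka.common.acl.AclPermissionType;
import org.apache.kafka.common.resource.PatternType;
import org.apache.kafka.common.resource.ResourcePattern;
import org.apache.kafka.common.resource.ResourceType;

public class GrantReadAcl {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
        // Encrypted, authenticated admin connection (SASL/SCRAM over TLS);
        // the username and password here are placeholders.
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "SCRAM-SHA-512");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.scram.ScramLoginModule required "
            + "username=\"admin\" password=\"admin-secret\";");

        try (AdminClient admin = AdminClient.create(props)) {
            // Allow the (hypothetical) analytics principal to read "orders".
            AclBinding readOrders = new AclBinding(
                new ResourcePattern(ResourceType.TOPIC, "orders", PatternType.LITERAL),
                new AccessControlEntry("User:analytics", "*",
                    AclOperation.READ, AclPermissionType.ALLOW));
            admin.createAcls(Collections.singleton(readOrders)).all().get();
        }
    }
}
```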
Required Skills & Qualifications:
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to translate business requirements into technical solutions.
- Strong knowledge of Kafka architecture, including brokers, topics, partitions, and replicas.
- Experience with Kafka security, including SSL, SASL, and ACLs.
- Proficiency in configuring, deploying, and managing Kafka clusters in cloud and on-premises environments.
- Experience with Kafka stream processing using tools like Kafka Streams, KSQL, or Apache Flink (see the Kafka Streams sketch following this list).
- Solid understanding of distributed systems, data streaming and messaging patterns.
- Proficiency in Java, Scala, or Python for Kafka-related development tasks.
- Familiarity with DevOps practices, including CI/CD pipelines, monitoring, and logging.
- Experience with tools like ZooKeeper, Schema Registry, and Kafka Connect.
- Ability to troubleshoot complex issues in a distributed environment.
- Experience with cloud platforms like AWS, Azure, or GCP.
- Kafka certification or related credentials, such as:
  - Confluent Certified Administrator for Apache Kafka (CCAAK)
  - Cloudera Certified Administrator for Apache Kafka (CCA-131)
  - AWS Certified Data Analytics – Specialty (with a focus on streaming data solutions)
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with other messaging systems like RabbitMQ or Apache Pulsar.
- Experience with data serialization formats like Avro, Protobuf, or JSON (an Avro producer sketch follows below).
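To ground the stream-processing expectation above, a minimal Kafka Streams sketch in Java. The topic names and the assumption that order values arrive as plain numeric strings are hypothetical simplifications:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderFilterApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Route high-value orders (value assumed to be a numeric string)
        // from "orders" to "orders-high-value" in real time.
        KStream<String, String> orders = builder.stream("orders");
        orders.filter((key, value) -> Double.parseDouble(value) > 1000.0)
              .to("orders-high-value");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```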
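Finally, a hedged Avro producer sketch assuming Confluent's Schema Registry and its KafkaAvroSerializer are on the classpath; the schema, topic, and endpoint addresses are placeholders:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroOrderProducer {
    // Example record schema; a real deployment would manage schemas centrally.
    private static final String ORDER_SCHEMA = """
        {"type":"record","name":"Order","fields":[
          {"name":"id","type":"long"},
          {"name":"amount","type":"double"}]}""";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
            "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers/validates the schema
        // against Schema Registry on first use.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://schema-registry:8081");

        Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
        GenericRecord order = new GenericData.Record(schema);
        order.put("id", 42L);
        order.put("amount", 1250.0);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "42", order));
        }
    }
}
```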