Kafka Architect

Employment Type: Contract
Job Location: Budapest, Hungary
Job Function:

Role Summary:
The Kafka Architect will drive the design, implementation, adoption, and operation of a real-time stream data platform for IoT-based microservices.

Define the strategy and roadmap for the “Stream Data Platform” based on Apache Kafka

Establish best practices based on identified use cases and required integration patterns.  

Accelerate our adoption of the Kafka ecosystem by creating a framework for leveraging technologies such as Kafka Connect, KStreams/KSQL, Schema Registry, and other streaming-oriented technologies

Assist in building out the DevOps strategy for hosting and managing our microservices and connector infrastructure  

Explore real-time, predictive analytics and machine learning on captured operational and business data to enable new business opportunities

Mentor existing team members by imparting expert knowledge to build a high-performing team in Kafka technology  


Job Requirement:

Expected skills:


General Messaging Middleware:
Seasoned messaging expert with an extensive, well-rounded background in a diverse set of messaging middleware solutions (commercial, open source, in-house) and an in-depth understanding of their architectures. Examples: Kafka, RabbitMQ
Strong fundamentals in distributed systems design and operations  
Deep understanding of different messaging paradigms (pub/sub, queuing), as well as delivery models, quality-of-service, and fault-tolerance architectures  
Knowledge of messaging protocols and associated APIs  
Applied experience with microservice architecture and the reactive model  
Strong background in integration patterns  

Established track record with Kafka technology, with hands-on production experience and a deep understanding of Kafka's architecture and internals, along with the interplay of its architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, Kafka Streams
Strong fundamentals in Kafka administration, configuration, and troubleshooting  
Knowledge of Kafka clustering, and its fault-tolerance model supporting HA and DR  
Practical experience with how to scale Kafka, KStreams, and Connector infrastructures, with the motivation to build efficient platforms  
Best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to use topics, partitions, and consumer groups effectively to provide optimal routing and support QoS
Experience with Kafka Streams / KSQL architecture and associated clustering model  
Hands-on experience as a developer using the Kafka API to build producer and consumer applications, along with expertise in implementing KStreams components; has developed KStreams pipelines and deployed KStreams clusters
Experience with developing KSQL queries and best practices of using KSQL vs KStreams  
Strong knowledge of the Kafka Connect framework, with experience using several connector types (HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce) and supporting wire-format translations. Knowledge of connectors available from Confluent and the community
Hands-on experience in designing, writing, and operationalizing new Kafka Connectors using the framework  
Strong familiarity with wire formats such as XML, JSON, Avro, and CSV, along with serialization/deserialization options
Familiarity with the Schema Registry
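As an illustration of the KSQL skills listed above, a persistent windowed aggregation over an IoT topic might look like the following sketch (topic, stream, and column names are hypothetical, not taken from a real deployment):

```sql
-- Declare a stream over an existing topic of JSON sensor events
-- (topic and field names are illustrative assumptions)
CREATE STREAM sensor_readings (device_id VARCHAR, temperature DOUBLE)
  WITH (KAFKA_TOPIC='iot.readings', VALUE_FORMAT='JSON');

-- Persistent query: per-device average temperature over 5-minute tumbling windows
CREATE TABLE device_avg_temp AS
  SELECT device_id, AVG(temperature) AS avg_temp
  FROM sensor_readings
  WINDOW TUMBLING (SIZE 5 MINUTES)
  GROUP BY device_id;
```

A query like this runs continuously on the KSQL cluster; the usual trade-off against KStreams is that an equivalent Java topology gives finer control over serdes, state stores, and error handling.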

Development Languages:
Solid programming proficiency with Java, and best practices in development  
Passion for writing high quality, rock-solid software, including test automation (unit/integration)  
Python and/or JavaScript

Network / Security / OS:
Fundamental understanding of the TCP/IP protocol stack, as well as familiarity with network routing protocols and the use of associated network tools
Knowledge of security protocols such as TLS, Kerberos, OAuth 2.0, and JWT, and how they work and integrate with Kafka
Experience with the Linux OS, including process management, network monitoring, I/O monitoring, and memory management
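For reference, securing a broker listener with TLS of the kind mentioned above is done in the broker's server.properties; the hostnames, paths, and passwords below are placeholders:

```properties
# TLS-only listener for client and inter-broker traffic (hostnames/paths are placeholders)
listeners=SSL://0.0.0.0:9093
advertised.listeners=SSL://broker1.example.com:9093
security.inter.broker.protocol=SSL

# Broker identity and trust material
ssl.keystore.location=/etc/kafka/ssl/broker.keystore.jks
ssl.keystore.password=changeit
ssl.truststore.location=/etc/kafka/ssl/broker.truststore.jks
ssl.truststore.password=changeit

# Require mutual TLS from clients
ssl.client.auth=required
```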

Cloud / DevOps:
Strong knowledge of architecture principles and design patterns for distributed systems in public clouds
Experience with cloud orchestration and container technologies such as Docker and Kubernetes to manage microservices and connectors in the cloud
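A minimal sketch of running Kafka Connect workers on Kubernetes, as implied above (the image tag, bootstrap address, and topic names are assumptions; converter and REST settings required by the image are omitted for brevity):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kafka-connect
spec:
  replicas: 3                # workers join one Connect group and rebalance tasks
  selector:
    matchLabels:
      app: kafka-connect
  template:
    metadata:
      labels:
        app: kafka-connect
    spec:
      containers:
        - name: connect
          image: confluentinc/cp-kafka-connect:7.6.0   # assumed image/tag
          ports:
            - containerPort: 8083                      # Connect REST API
          env:
            - name: CONNECT_BOOTSTRAP_SERVERS
              value: kafka-broker:9092                 # assumed broker service name
            - name: CONNECT_GROUP_ID
              value: connect-cluster
            - name: CONNECT_CONFIG_STORAGE_TOPIC
              value: connect-configs
            - name: CONNECT_OFFSET_STORAGE_TOPIC
              value: connect-offsets
            - name: CONNECT_STATUS_STORAGE_TOPIC
              value: connect-status
```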

Monitoring / Operational Management:
Experience with monitoring Kafka infrastructure along with related components (Connectors, KStreams, and other producer/consumer apps)  
Working knowledge of Splunk, how it integrates with Kafka, and using it effectively as a Kafka operational tool  

Project Methodologies:
Practical experience working with Agile methodologies such as Scrum and Kanban  
Experienced with writing user stories and participating in agile ceremonies  
Skilled in managing projects from inception to go-live  
Strong experience with Atlassian Jira and managing a board

Soft Skills:
Motivated, self-starter with the ability to lead effectively across organizations  
Exemplary communication skills in both oral and written form  
Able to create technical presentations to convey architectural vision, while tailoring to the audience  
Demonstrated ability to think strategically about business and technical challenges within the enterprise
Highly technical and analytical, with a track record of thought leadership and the ability to exploit opportunities for innovation

Functional/Business Knowledge:
Analyzes, acquires, installs, modifies and supports operating systems, databases, utilities and Internet/intranet-related tools. Conducts systems programming and systems support activities, such as new or revised program language codes, processing routines and report generators. Monitors effective language codes, processing routines, hardware use and use of database management techniques. Develops and reviews operator and control instructions. Prepares and conducts system and programming tests requiring interfacing of hardware and software. Conducts programming tasks including program design, program coding, debugging and documentation. As directed, prepares feasibility studies and designs tests to determine operating characteristics of software.