Big Data Engineer

Employment Type: Contract
Job Location: Munich / Hamburg, Germany
English Description:

We are looking for an engineer with 6-8+ years of overall experience, including at least 4 years' experience with Big Data tools and technologies such as Spark, Kafka, Flume, Sqoop, Hive, HDFS, MapReduce, and HBase.

Job Requirement:

  • Good understanding of cloud-based SaaS, PaaS, and IaaS solutions
  • Experience deploying Big Data solutions on cloud platforms (AWS, Azure, Google Cloud)
  • AWS Associate-level certification
  • Experience with hybrid BI/DWH implementations, i.e. combining traditional and Big Data technologies
  • Experience with Big Data technologies: Apache Hive, Apache Hue, Apache Flink, Apache Spark, Apache Parquet, Apache Mesos
  • Experience with Apache Kafka, including Kafka on Kubernetes, Spark on Kubernetes, Kubernetes controllers/operators, Kafka Streams (RocksDB), and Kafka log compaction
  • Development experience in at least one of Java, Scala, or Python, with microservices, event sourcing, KSQL, AWS DynamoDB, and AWS RDS
  • Strong Telecom domain experience
  • Experience with Oracle GoldenGate for Big Data, Attunity Replicate (Change Data Capture), and Alluxio (virtual distributed storage)
  • Other technologies/frameworks: HDFS, Avro
  • Implementation experience with data retention policies, error logging and handling mechanisms, restartability options in case of job failure, etc.
  • Security implementation, including handling customer-sensitive information and awareness of EU GDPR compliance
  • Experience implementing authentication and authorization mechanisms for secure data access
  • Proficiency in German is a must