Do you have a passion for building data solutions? Do you want the opportunity to be an integral part of building a unique AI product that will process over 100 TB per day? Does furthering your knowledge with the latest ecosystems, such as Hadoop and Spark, interest you?
If the answer to the above is YES, then Eurobase would like to hear from you!
Our Berlin-based client is looking to further develop their one-of-a-kind AI product. Due to a huge increase in customer base and data volume, they are looking for experienced Data Engineers to help build data solutions that process over 100 TB of data per day.
Responsibilities:
- Build a generic environment for our advertisement solution.
- Build a big data solution to manage and process tens of TB per day.
- Build a unified way to work with many APIs in parallel.
- Build a high-throughput product that handles tens of billions of events per day.
Requirements:
- At least 4 years of experience and a deep understanding of core Java and/or Scala, especially concurrency/multi-threaded programming (native threads, the Actor model, Akka)
- Ability to adjust to a fast-paced, dynamic startup environment and culture
- Knowledge of distributed in-memory architectures and caching (Memcached, Redis, EHCache, etc.)
- Experience with big data solutions (Hadoop, Spark, Parquet) and large messaging infrastructures (Kafka, RabbitMQ)
- Experience with NoSQL solutions (Cassandra, Druid, Impala, Presto)
- Smart, interdisciplinary fast learner and team player
- Knowledge of SQL databases and the ability to craft queries comfortably
- Strong experience with Java server development and cloud services (Amazon EC2)