Location: Slovak Republic, Bratislava; 150 km radius around Bratislava, incl. Brno (CZ), Vienna (AT), Nitra (SK)

Job summary:

Would you like to work with Big Data (Airflow, Spark, Kafka, BigQuery) at one of the leading TravelTech companies in Central Europe and manage their Data Lake consisting of huge amounts of transactional data?

Our client provides innovative TravelTech solutions for customers and businesses. Their unique online search engine allows users to combine transportation from carriers that normally do not cooperate.

Travel itineraries allow users to combine flights and ground transportation from over 800 carriers.

We are looking for a Senior Cloud Big Data Engineer to join the team in Bratislava, Slovakia.


  • Manage the Data Lake, which consists of transactional data from our booking system as well as a stream of events from our frontend.
  • Write code in Python / Scala / Go to implement parts of the Data Lake, focusing on automation to the greatest extent possible; this includes implementing batch pipelines that load data incrementally as well as streaming pipelines.
  • Design and implement the company Data Lake from transactional and streaming data
  • Implement ML algorithms and bring AI to production so it has a real impact on the product
  • Identify weak spots and refactor code as needed during development
  • Optimize code and usage of 3rd-party services for speed and cost-effectiveness


Must-have Skills:

  • Python
  • Big Data engineering in the Cloud (ideally GCP)
  • Airflow or alternatives

What Do We Expect:

  • 2+ years of full-time experience in a similar position
  • Strong coding skills in Python or Scala (the team uses both)
  • Broad knowledge of different types of data storage engines (relational, non-relational)
  • Hands-on experience with at least 2 of them, e.g. PostgreSQL, MySQL, Redshift, Elasticsearch
  • Experience with orchestration tools (Airflow ideally)
  • Experience with Big Data processing engines such as Apache Spark and Apache Beam, and their cloud runners Dataproc/Dataflow
  • Knowledge of ML/AI algorithms such as OLS and gradient descent, and their application from linear regression to deep neural networks
  • Advanced query language (SQL) knowledge
  • Experience with Batch and Real-time data processing
  • Cloud Knowledge (GCP is the best fit, alternatively AWS or Azure)
  • BS/MS in Computer Science or a related field (ideal)

What we offer:

Great Team:

  • The Data Intelligence Tribe currently comprises 5 teams with about 30 Data specialists… so there are lots of professionals to learn from.
  • Headcount will at least double within the next 12 months… so you’ll have a chance to onboard new joiners.
  • Teams are led by both Tech leads and Product managers… so you get the best of both worlds.
  • Our colleagues collaborate extensively across team boundaries and tribes… which strengthens your expertise and professional growth.

Why To Join This Team?

  • Transforming and enhancing the business by making use of Data
  • It feels like a startup within a scaleup company!
  • Fast-paced, ambitious, growing company… which means a lot of data to process!
  • Great team spirit and autonomy to deliver results the way you prefer.

Tech Stack:

  • Airflow, BigQuery, Kafka, PostgreSQL
  • Google Cloud Platform
  • Python

Work Methodology:

  • Agile working processes: Scrum, Kanban, and even Scrumban in place.
  • Fully remote work, which will continue, while also offering people very modern offices.
  • Very flexible, but face-to-face collaboration is also valued, as it creates great team spirit and a feeling of belonging and keeps mental health in check.

Employment Terms:

  • Permanent & full-time employment
  • Flexible working schedule
  • Start date as soon as possible


  • Quarterly bonuses
  • Stock options
  • 20+5 days vacation / year
  • Meal vouchers, Cafeteria program, sick days, VIP Medical Care, Multisport card
  • Hardware from Apple or Microsoft, based on your preference

If interested, please share your CV at iuliana@euroasiarecruiting.com.