Location:
- 150 km radius around Bratislava (Slovakia), incl. Brno (CZ), Vienna (Austria), Nitra (SK)
Summary objective of the job:
Would you like to work with terabytes of Big Data at one of the leading TravelTech companies in Central Europe and create software tools used by tens to hundreds of developers in their day-to-day job?
Our client provides innovative TravelTech solutions for customers and businesses. Their unique online search engine allows users to combine transportation from carriers that normally do not cooperate.
Travel itineraries allow users to combine flights and ground transportation from over 800 carriers.
We are looking for a Senior Data Platform Engineer to join the team in Bratislava, Slovakia.
Responsibilities:
- Develop and automate large scale, high-performance data processing systems (batch and/or streaming) to drive business growth.
- Build scalable data platform products leveraging Airflow scheduler/executor framework.
- Work with modern cloud data stack on GCS, Pub/Sub, Kafka, BigQuery, Postgres, Looker.
- Build scalable, reliable, secure, efficient and highly performant platforms and infrastructure for a variety of analytics and business applications.
- Contribute to data pipeline tooling, implementing and extending its functionality in Python so that all users of the framework benefit.
- Contribute to shared data engineering tooling & standards to improve the productivity and quality of output for data engineers across the company.
- Improve data quality by using & improving internal tools to automatically detect issues.
Requirements:
Must-have Skills:
- Python
- Big Data engineering in the Cloud (ideally GCP)
- Airflow or alternatives
What Do We Expect:
- 5+ years of full-time, industry experience in software development
- Strong coding skills in Python
- Broad knowledge of different types of data storage engines (relational, non-relational)
- Hands-on experience with at least two of them, e.g. PostgreSQL, MySQL, Redshift, Elasticsearch
- Experience with orchestration tools (Airflow ideally)
- Advanced query language (SQL) knowledge
- Working with batch or real-time data processing
- Rigor in high code quality, automated testing, and other engineering best practices
- Cloud Knowledge – Google Cloud (best fit), AWS, Azure
- BS/MS in Computer Science or a related field (ideal)
Benefits
Great Team:
- The Data Intelligence Tribe currently comprises 5 teams with about 30 data specialists… so there are lots of professionals to learn from.
- The tribe is expected to at least double in size within the next 12 months… so you’ll have a chance to onboard new joiners.
- Teams are led by both Tech leads and Product managers… so you get the best from both worlds.
- Our colleagues collaborate extensively across team and tribe boundaries… which strengthens your seniority and supports your professional growth.
Why To Join This Team?
- Transforming and enhancing the business by making use of Data
- It feels like a startup within a scaleup company!
- Fast-paced, ambitious, growing company… which means a lot of data to process!
- Great team spirit and autonomy to deliver results the way you prefer.
Tech Stack:
- Airflow, BigQuery, Kafka, PostgreSQL
- Google Cloud Platform
- Python
Work Methodology:
- Agile working processes: Scrum, Kanban, and even Scrumban in place.
- Fully remote work, and it will stay that way, while very modern offices are also on offer.
- Very flexible, but face-to-face collaboration is also valued, as it creates great team spirit and a feeling of belonging, and keeps mental health in check.
Employment Terms:
- Permanent & full time employment
- Flexible working schedule
- Start date: as soon as possible
Benefits:
- Quarterly bonuses
- Stock options
- 20+5 days vacation / year
- Meal vouchers, Cafeteria program, sick days, VIP Medical Care, Multisport card
- Hardware from Apple or Microsoft based on your preferences
- Salary: €4,250/month (up to €51,000/year)
If interested, please share your CV at iuliana@euroasiarecruiting.com.