HomeToGo is a company built around the idea of making it easy to find the perfect accommodation for any trip. Since its founding in 2014, HomeToGo has grown to offer the world's largest selection of vacation rentals, listing millions of offers from thousands of trusted partners. HomeToGo employs more than 250 people and manages 23 local apps and websites across Europe, North America, South America, Australia and Asia-Pacific. HomeToGo also operates brands such as Tripping.com, CASAMUNDO and Wimdu.
As a Senior Data Engineer (Data Warehouse), you will join our ambitious and forward-thinking colleagues in the Data Engineering team. At HomeToGo we capture, process and store hundreds of gigabytes of new data every day using technologies such as Apache Kafka and Apache Spark. Our data lake holds hundreds of terabytes of data in AWS S3, which different teams utilise in various ways by running data jobs in Apache Airflow: from building self-service analytics dashboards via AWS Redshift, Apache Druid, Redash and Tableau to training ML models that make thousands of decisions per second on our websites every day. You will contribute to HomeToGo's data platform by developing our Data Warehouse, which must meet challenging scalability, performance and usability requirements. Your work will be critical to ensuring that everyone at HomeToGo can make data-driven decisions efficiently and reliably on a daily basis.
How You'll Add Value: