Posted 3w ago

Data Engineer

@ Holafly
Dublin, Leinster, Ireland
Hybrid · Full Time
Responsibilities: Building pipelines, developing APIs, leading transformations
Requirements Summary: Proven data engineering experience in fast-growing environments; strong Python, FastAPI, DBT, and SQL skills; experience with GCP/BigQuery.
Technical Tools Mentioned: Python, FastAPI, DBT, SQL, GCP, BigQuery
Job Description

Company Overview

Holafly is a high-growth scale-up revolutionising how businesses and travellers connect to the internet abroad. Since 2018, we’ve empowered travellers in over 200 destinations worldwide with secure and reliable eSIM solutions. With a team of 500+ professionals across multiple countries, we are scaling globally to support travellers with seamless, unlimited data connectivity.

We’re not just connecting people—we’re enabling freedom and peace of mind, ensuring our users stay connected from the second they land, wherever their journey takes them.

The Role

As our Data Engineer, you will build and scale the high-performance pipelines and data services that turn raw connectivity data into actionable intelligence. Your work directly enables our leadership and product teams to make real-time, data-driven decisions that fuel our international growth.

Key Responsibilities

  • Build and scale robust ETL/ELT pipelines to integrate diverse data sources into a unified, high-availability ecosystem.

  • Develop and maintain high-performance APIs using FastAPI to expose data insights to internal services and stakeholders.

  • Lead data transformations by using DBT to ensure our warehouse is structured, modular, and ready for high-level analysis.

  • Optimise data structures and schemas to ensure lightning-fast query performance for analytical and operational use cases.

  • Automate data validation and cleansing processes to guarantee the highest standards of data integrity and security.

  • Drive technical innovation by implementing modern frameworks that enhance our processing capabilities and data reliability.

Qualifications

  • Proven track record in Data Engineering or a similar back-end heavy role within high-growth environments.

  • Advanced proficiency in Python, with specific experience building and deploying APIs using FastAPI.

  • Deep expertise in DBT, with a focus on building scalable and maintainable transformation layers.

  • Mastery of SQL for writing complex queries and manipulating large datasets.

  • Solid understanding of modern data modelling and cloud-based warehousing architecture (GCP and BigQuery preferred).

Nice to Have

  • Experience with Google Dataflow or Apache Beam.

  • Background in Infrastructure as Code (IaC) or CI/CD for data workflows.

Hybrid Modality

We focus on trust and performance. We are implementing a two-phase approach to our hybrid modality:

Phase 1: While teams are establishing ways of working, building trust, and developing their rhythm, we ask that you be present in the office three days a week. In-person proximity is key to accelerating the forming stage.

Phase 2: Once teams are performing, we move to a more relaxed hybrid model. We ask for a minimum of one day per week in the office, centred around critical ceremonies such as sprint planning, reviews, and retrospectives.

Benefits & Perks

  • 25 Days of PTO

  • 10% pension contribution

  • €3,000 for medical/health coverage

  • Self-development budget of $500