
Data Engineer - Valsoft

@ Valsoft
Beirut, Beirut, Lebanon
Hybrid · Full Time
Responsibilities: Design ETL/ELT pipelines, build data models, maintain data workflows
Requirements Summary: 4+ years of experience as a Data Engineer; strong SQL, Python; hands-on with Snowflake, dbt, AWS, Airflow, Fivetran/Stitch; experience with production data systems; knowledge of data warehousing, ETL/ELT, and structured/semi-structured data; experience with Power BI or Tableau.
Technical Tools Mentioned: Snowflake, dbt, AWS, Apache Airflow, Fivetran, Stitch, SQL, Python, Power BI, Tableau
Job Description

Aspire Software is looking for a Data Engineer to join our team in Lebanon.

Here is a little window into our company: Aspire Software operates and manages wholly owned software companies, providing mission-critical solutions across multiple verticals. By implementing industry best practices, Aspire delivers a time-sensitive integration process, and its decentralized operating model has made it a hub for rapid growth through reinvestment in its portfolio.

About the Role:
Valsoft Corporation is seeking a skilled Data Engineer to join our Finance & Acquisition Data and Reporting team. In this role, you will design, build, and maintain scalable data pipelines and analytics infrastructure that support financial reporting, acquisition analytics, forecasting, and executive decision-making across Valsoft’s portfolio of companies.

You will work closely with Finance, M&A, Reporting, and Engineering stakeholders to deliver reliable, high-quality data solutions. The role involves owning data pipelines end-to-end, improving data quality and performance, and translating complex business requirements into well-designed data models and workflows.

This position is well suited for a data engineer with 4+ years of experience who is comfortable working with production data systems, enjoys solving data reliability and scalability challenges, and wants to make a direct impact on financial and strategic outcomes.

Key Responsibilities

  • Design, build, and maintain scalable and reliable ETL/ELT pipelines
  • Own and optimize Snowflake data models using dbt, including testing and documentation
  • Ingest and manage data from multiple sources using Fivetran and/or Stitch
  • Orchestrate, monitor, and troubleshoot workflows using Apache Airflow
  • Write high-performance, production-grade SQL and Python code
  • Implement data quality checks, monitoring, and performance optimizations
  • Build and maintain API-based integrations between applications and the data warehouse
  • Work with AWS services (S3, Lambda, IAM, API Gateway, etc.) to support data workflows
  • Partner with Finance and M&A stakeholders to deliver analytics, reporting, and forecasting solutions
  • Support BI tools such as Power BI (preferred) or Tableau
  • Contribute to best practices around version control, CI/CD, testing, and deployment
  • Mentor junior team members and contribute to improving team standards and documentation
  • Participate in architecture discussions and continuous improvement initiatives