Posted 1w ago

Full-stack Data Engineer II

@ Archer
Cairo, Cairo, Egypt
Remote · Full Time
Responsibilities: Design Delta Lake tables, develop agentic workflows, build API backends
Requirements Summary: 2–4 years experience with PySpark/SQL; strong DataBricks (Unity Catalog, DLT, Notebooks); Lakehouse architecture; hands-on LLMs; building AI agents; Python and Docker backend.
Technical Tools Mentioned: PySpark, SQL, DataBricks, Unity Catalog, DLT, Notebooks, LangChain, AutoGen, Mosaic AI, Python, Docker, FastAPI
Job Description

FULL-STACK DATA ENGINEER II

Location: Cairo, Egypt

Experience: 2–4 Years

Stack: DataBricks & Agentic AI

The Role

We are looking for a forward-thinking Full-Stack Data Engineer II to join our team in Egypt. This is not a traditional ETL role; you will be at the forefront of the Data + AI era, leveraging DataBricks to build intelligent data products. You will be responsible for the end-to-end lifecycle of data, from ingestion and Lakehouse architecture to developing agentic AI workflows that allow our business systems to act autonomously based on data insights.

AI & Agentic Focus

Build and deploy LLM-based agents that utilize data tools to solve complex tasks, including implementing Retrieval-Augmented Generation (RAG) patterns and autonomous chains-of-thought for decision-making.
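To make the RAG pattern concrete, here is a toy sketch of the retrieve-then-prompt step using only the standard library. The bag-of-words "embedding" and the helper names (`embed`, `retrieve`, `build_prompt`) are illustrative assumptions; a real pipeline would use an embedding model and a framework such as LangChain or Mosaic AI.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector (real systems use a model).
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    # Retrieval-Augmented Generation: ground the LLM prompt in retrieved context.
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Delta Lake tables support ACID transactions.",
    "FastAPI serves Python web APIs.",
]
print(build_prompt("How do Delta Lake tables handle transactions?", docs))
```

The point of the pattern is the grounding step: the agent's prompt is assembled from retrieved documents rather than relying on the model's parametric memory alone.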

Key Responsibilities

  • Design and optimize Delta Lake tables within DataBricks using Unity Catalog for governance.
  • Develop and deploy agentic workflows using LangChain, AutoGen, or DataBricks Mosaic AI.
  • Build Python-based backend services (FastAPI) to expose AI-driven insights.
  • Manage data pipelines using DataBricks Workflows and Delta Live Tables (DLT).
  • Implement monitoring frameworks to ensure agent accuracy, safety, and performance.
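As a rough sketch of the agent-accuracy monitoring the last bullet describes (the function names and threshold are assumptions, not this team's framework; production setups would log metrics to MLflow or similar):

```python
def evaluate_agent(agent, test_cases, threshold=0.9):
    """Run an agent over labelled cases and flag it if accuracy drops.

    `agent` is any callable mapping a question to an answer;
    `test_cases` is a list of (question, expected_answer) pairs.
    """
    correct = sum(1 for question, expected in test_cases
                  if agent(question) == expected)
    accuracy = correct / len(test_cases)
    return {"accuracy": accuracy, "healthy": accuracy >= threshold}

# A stubbed "agent" standing in for a real LLM-backed workflow.
def stub_agent(question):
    return "42" if "answer" in question else "unknown"

cases = [
    ("what is the answer?", "42"),
    ("what is the question?", "unknown"),
]
print(evaluate_agent(stub_agent, cases))  # → {'accuracy': 1.0, 'healthy': True}
```

Running a labelled evaluation set on a schedule and alerting when `healthy` flips to false is one common way to catch regressions after prompt or model changes.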

Technical Requirements

  • 2–4 years experience with PySpark and SQL
  • Strong experience with DataBricks (Unity Catalog, DLT, Notebooks)
  • Understanding of Lakehouse architecture
  • Hands-on experience with LLMs and prompt engineering
  • Experience building AI agents using APIs and tools
  • Backend development in Python and Docker

Preferred Experience

  • MLflow for model tracking and deployment
  • Vector databases (Pinecone, Weaviate, DataBricks Vector Search)
  • Familiarity with Egyptian data localization requirements

What We Offer

  • Work on cutting-edge Agentic AI projects
  • Competitive compensation in EGP with bonuses
  • Hybrid work environment in New Cairo
