Location: Mumbai / Bangalore / Hyderabad / Gurgaon
Level: Experienced – Associate / Senior Associate
Required Experience: 3–8 years
We’re looking for a skilled Data Engineer with Microsoft
Fabric experience to join our growing data and AI team. In this role, you will
design and build modern data platforms leveraging Microsoft Fabric, enabling
scalable analytics, AI-driven insights, and enterprise-grade data solutions for
global clients.
This is an excellent opportunity to work on next-generation
data architectures, contribute to AI-driven transformation programs, and grow
into advanced data engineering and platform leadership roles.
Responsibilities
- Design, build, and maintain scalable data pipelines using Microsoft Fabric components such as Data Factory (Pipelines), Dataflows Gen2, and Fabric Notebooks.
- Develop and manage Lakehouse architectures leveraging OneLake and Fabric Lakehouse for structured and semi-structured data.
- Build and optimize batch and real-time data ingestion pipelines from multiple enterprise data sources, including APIs, streaming platforms, and enterprise systems.
- Implement ETL/ELT workflows ensuring data quality, reliability, and performance optimization.
- Collaborate with data scientists, analysts, and business stakeholders to deliver analytics-ready datasets.
- Integrate Microsoft Fabric with Azure ecosystem tools such as Azure Data Lake, Power BI, and Synapse components.
- Implement observability and monitoring using Azure Log Analytics, alerts, and action groups to ensure system reliability.
- Work with governance and metadata management tools (e.g., Fabric Catalog, Purview) to ensure data discoverability, lineage, and compliance.
- Optimize query performance and storage strategies for large-scale datasets.
- Support deployment automation using CI/CD pipelines and DataOps practices.
- Stay current with evolving Microsoft Fabric capabilities and modern data engineering trends.
Requirements
Qualifications
- Bachelor’s/Master’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
- 2–4 years of hands-on experience working with Microsoft Fabric (Lakehouse, Data Factory, Pipelines, Warehouse, Dataflows Gen2).
- Strong SQL and data modelling expertise.
- Experience building ETL/ELT pipelines using modern data engineering practices.
- Working knowledge of Python or PySpark for data transformation.
- Experience with Azure data services such as Azure Data Lake Storage, Azure SQL, or Synapse Analytics.
- Understanding of data warehousing concepts and dimensional modelling.
- Experience integrating Power BI with enterprise data platforms is preferred.
- Familiarity with CI/CD, version control, and deployment automation.
- Understanding of data governance, security, and performance optimization techniques; exposure to data governance tools such as Purview or Fabric Catalog is a plus.
- Exposure to real-time ingestion frameworks such as Eventstream or Azure Event Hubs is a plus, as is experience with event-driven architectures or IoT data ingestion.
Why AuxoAI?
- Build and ship real AI and data products used by global enterprises
- Work on GenAI, LLM-powered analytics, and modern data platforms
- End-to-end ownership from architecture to production deployment
- Solve real-world problems across healthcare, CPG, BFSI, and more
- Collaborate with top-tier engineers, architects, and AI specialists
- Opportunity to shape next-generation enterprise data platforms
- Pathways into solution architecture and AI engineering roles
- High-velocity projects with measurable business outcomes
- Modern stack: Microsoft Fabric, Azure, Python, DataOps, AI platforms
- 1:1 mentorship from industry leaders in AI and data engineering
- Annual team offsites and internal tech talks
- Regular knowledge-sharing and peer-led sessions
- Competitive compensation + flexible hybrid work model