Posted 5mo ago

Data Engineer: Python, Snowflake

@ Staples
Chennai, Tamil Nadu, India
Onsite, Full Time
Responsibilities: design pipelines, optimize queries, develop tests
Requirements Summary: Bachelor's in CS/IS/Engineering or related field; 2–5 years data engineering experience; strong SQL, Snowflake, Python; experience with ETL/ELT, data warehousing, orchestration; cloud platform familiarity; excellent communication and teamwork.
Technical Tools Mentioned: Python, SQL, Snowflake, Airflow, ADF, Luigi, DBT, Kafka, Spark Streaming, Flink, Git, AWS, Azure, GCP
Job Description

Duties & Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines to support business and analytics needs
  • Write, tune, and optimize complex SQL queries for data transformation, aggregation, and analysis
  • Translate business requirements into well-designed, documented, and reusable data solutions
  • Partner with analysts, data scientists, and stakeholders to deliver accurate, timely, and trusted datasets
  • Automate data workflows using orchestration/scheduling tools (Airflow, ADF, Luigi, etc.)
  • Develop unit tests, integration tests, and validation checks to ensure data accuracy and pipeline reliability
  • Document pipelines, workflows, and design decisions for knowledge sharing and operational continuity
  • Apply coding standards, version control practices, and peer code reviews to maintain high-quality deliverables
  • Proactively troubleshoot, optimize, and monitor pipelines for performance, scalability, and cost efficiency
  • Support functionality rollouts, including being available for post-production monitoring and issue resolution
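The "develop unit tests, integration tests, and validation checks" duty above can be pictured with a minimal sketch. This is illustrative only, not part of the posting: SQLite stands in for Snowflake, and the `orders` table, its columns, and the `validate_orders` helper are hypothetical names invented for the example.

```python
import sqlite3

def validate_orders(conn):
    """Run simple data-quality checks on a hypothetical orders table."""
    cur = conn.cursor()
    # Basic completeness check: the load should have produced some rows.
    row_count = cur.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    # Basic integrity check: a key column should never be NULL.
    null_ids = cur.execute(
        "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
    ).fetchone()[0]
    return {
        "row_count": row_count,
        "null_customer_ids": null_ids,
        "passed": row_count > 0 and null_ids == 0,
    }

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, 11)])
    print(validate_orders(conn))
```

In practice such checks would run as a task in the orchestration tool (Airflow, ADF, Luigi) after each pipeline load, failing the run when a check does not pass.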


Requirements

Basic Qualifications

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field
  • 2–5 years of hands-on experience in data engineering and building data pipelines
  • At least 3 years of experience writing complex SQL queries in a cloud data warehouse/data lake environment
  • Solid hands-on experience with data warehousing concepts and implementations
  • At least 1 year of experience with Snowflake or another modern cloud data warehouse
  • At least 1 year of hands-on Python development
  • Familiarity with data modeling and data warehousing concepts
  • Experience with orchestration tools (e.g., Airflow, ADF, Luigi)
  • Familiarity with at least one cloud platform (AWS, Azure, or GCP)
  • Strong analytical, problem-solving, and communication skills
  • Ability to work both independently and as part of a collaborative team


Preferred Qualifications

  • Experience with DBT (Data Build Tool) for data transformations
  • Exposure to real-time/streaming platforms (Kafka, Spark Streaming, Flink)
  • Familiarity with CI/CD and version control (Git) in data engineering projects
  • Exposure to the e-commerce or customer data domain
  • Understands the technology landscape, stays up to date on current trends and new technologies, and brings new ideas to the team