Posted 1mo ago

Data Engineer III

@ R2Net
New York, New York, United States
$147k-$172k/yr | Remote | Full Time
Responsibilities: Design pipelines, maintain models, collaborate with stakeholders
Requirements Summary: 7+ years of data engineering experience; SQL and Python; Snowflake/BigQuery; Airflow; dbt; AWS; Tableau; ETL/ELT; CI/CD; API design; Scrum/Jira.
Technical Tools Mentioned: Airflow, dbt, Rivery, Python, SQL, Snowflake, BigQuery, Tableau, Google Analytics, AWS, ECR, Lambda, Pulumi, Docker, Jira
Job Description

The Data Engineer III is responsible for building and maintaining robust data pipelines and models for the R2Net organization, overseeing a broad scope of ETL infrastructure and databases. By applying expertise across diverse systems, APIs, and platforms, this role plays a key part in enabling accurate and timely access to data across the organization, with an emphasis on simplifying and centralizing a complex ecosystem, thereby providing trustworthy, analytics-ready resources for the various business units.

By interfacing with key stakeholders in Engineering, Analytics, Operations, Finance, Marketing, and Customer Service, this role will build a data environment that is accurate, complete, timely, and dependable, and will serve as a trusted partner to key associates across the org. In achieving these goals, the role extends beyond pipeline management: it involves deep collaboration with business stakeholders, proactive engagement in data strategy, and a focus on driving measurable impact through data. This engineer is expected to bridge technical execution with business outcomes, owning long-term initiatives that drive top-line and bottom-line growth for R2Net as a whole.

Key Responsibilities:

  • Design, implement, and maintain complex data pipelines, ensuring scalability and reliability using Airflow, dbt, Rivery, Python, and SQL, enabling robust ingestion and transformation of structured and semi-structured data.
  • Serve as a strategic partner to business teams, working closely with stakeholders to translate high-level goals into data solutions that support forecasting, performance tracking, and optimization.
  • Develop and maintain clean, well-documented data models in Snowflake and BigQuery that support analytics, reporting, and operational workflows and contribute to architecture decisions.
  • Integrate data from a variety of internal and external sources, including Google Analytics and third-party APIs, to support full-funnel visibility across departments.
  • Enable self-service analytics by ensuring data assets are discoverable and usable via tools such as Tableau, including thoughtful semantic layer design and performance tuning.
  • Contribute to the development of robust monitoring and observability practices for data quality and pipeline health.
  • Collaborate on architecture and design decisions, including cloud infrastructure and containerization using AWS, Pulumi, and Docker.
  • Maintain strong documentation and promote engineering standards that ensure transparency, maintainability, and reusability of data systems.