Posted 1mo ago

Data Engineer

@ DevCo Residential Group
Bellevue, Washington, United States
$90k-$105k/yr · Hybrid · Full Time
Responsibilities: Data integration, Data warehousing, Data quality
Requirements Summary: 2+ years of experience in data engineering or analytics engineering; strong SQL; experience with data pipelines, APIs, and cloud data warehouses; familiarity with orchestration tools.
Technical Tools Mentioned: SQL, APIs, Snowflake, BigQuery, Redshift, Azure, Airflow, Prefect, dbt, Fivetran, Stitch
Job Description

DevCo is seeking a full-time Data Engineer to join our team in Bellevue, WA in a hybrid in-office capacity. The Data Engineer owns data ingestion, transformation, and warehouse architecture across DevCo's operational and financial systems. This role designs and maintains scalable, production-grade pipelines that integrate data from third-party platforms and internal systems, ensuring high-quality, reliable datasets for analytics, BI, and decision-making.

About the Company

DevCo Residential Group is an integrated development and investment company focused on multi-family communities. Founded in 1994, the company and its affiliates develop, own, and manage over 14,000 affordable and market-rate apartment units throughout the United States. Headquartered in Bellevue, Washington, DevCo is one of the largest providers of affordable housing in Washington State.

Mission:

DevCo Residential Group's mission is to develop, construct, and manage high-quality multifamily housing that provides stability, fosters growth, and delivers long-term value to our residents and stakeholders.

Vision:

DevCo's vision is to be a leading developer, builder, and manager of quality multifamily housing throughout the western US.

Values:

  • Quality: We deliver excellence in every aspect of our work.
  • Commitment: We honor our promises with unwavering dedication.
  • Teamwork: We achieve more together through collaboration and respect.
  • Integrity: We uphold the highest ethical standards in all we do.

Pay Details: $90,000 to $105,000/year + Bonus

Schedule: Monday-Friday, 8am-5pm (hybrid in-office format)

Job Responsibilities

  • Data Integration & Pipeline Engineering
      • Design, build, and maintain automated data pipelines consuming data via APIs, SFTP, flat files, databases, and third-party connectors.
      • Own integrations with Yardi, Procore, Smartsheet, Northspyre, HappyCo, and other operational platforms.
      • Implement modern ELT/ETL workflows using orchestration and transformation frameworks.

  • Data Warehouse Architecture
      • Architect and maintain the enterprise data warehouse, including schema design, partitioning strategy, indexing, and performance optimization.
      • Develop layered data models (raw, curated, analytics-ready) that support enterprise reporting and BI.

  • Data Quality, Reliability & Observability
      • Establish data quality checks, reconciliation rules, freshness monitoring, and anomaly detection.
      • Build logging, alerting, and monitoring to ensure pipeline reliability and SLA adherence.
      • Manage scalability and performance as data volume and usage grow.

  • Documentation & Governance
      • Document data sources, pipeline logic, schemas, lineage, and ownership.
      • Support data governance standards, including security, access controls, and handling of sensitive data.

  • Collaboration
      • Partner closely with the Data Analyst to ensure the warehouse supports semantic modeling and reporting needs.
      • Collaborate with Finance, Development, Construction, Property Management, and Accounting teams to validate business logic and requirements.