Posted 6mo ago

Data Engineer (Python, JavaScript, Pentaho) - TS/SCI Poly

@ Leading Path Consulting
Chantilly, Virginia, United States
Onsite · Full Time
Responsibilities: design backend services, build data pipelines, collaborate in an Agile environment
Requirements Summary: TS/SCI w/ FS Poly; Pentaho ETL; Python; PySpark; JavaScript; requirement gathering; integrating new tech stacks.
Technical Tools Mentioned: Pentaho, Python, PySpark, JavaScript, Snowflake, Databricks
Job Description

We’re looking for a Data/Software Engineer who is passionate about building modern, scalable solutions for ingesting and transforming data. This role blends back-end engineering with data pipeline development and is perfect for someone who enjoys designing modular services and bringing structure to complex data environments.

As part of our Agile team, you’ll design and develop software products and services that efficiently ingest, process, and manage data from a variety of sources. You’ll play a key role in building robust, reusable APIs and data pipelines that support critical operational and analytical systems.

KEY RESPONSIBILITIES

 Design and develop scalable backend services and data ingestion solutions.

 Perform data modeling, data mapping, and large-scale file manipulation.

 Collaborate across disciplines in an Agile environment with minimal supervision.

 Drive innovation and process improvement with a hands-on development approach.

 Optimize applications for maximum speed and scalability.

Required Skills:

1. Demonstrated experience with Pentaho (experience with another ETL tool may substitute, given strong experience across all other listed required skills).

2. Demonstrated experience identifying and validating requirements for Extract, Transform, and Load systems.
3. Demonstrated experience in Python development.

4. Demonstrated experience with PySpark.

5. Demonstrated experience with JavaScript.

6. Demonstrated experience in integrating new technology stacks into software systems.

Desired Skills:

1. Demonstrated experience developing custom components with Pentaho.
2. Demonstrated experience with Snowflake.
3. Demonstrated experience with Databricks.