Career Opportunities: Python Developer (48873)
Company Overview
Incedo is a US-based consulting, data science, and technology services firm with over 3,000 people helping clients from our six offices across the US, Mexico, and India. We help our clients achieve competitive advantage through end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and design capabilities coupled with deep domain understanding. We combine services and products to maximize business impact for our clients in the telecom, banking, wealth management, product engineering, and life science & healthcare industries.
Working at Incedo will give you the opportunity to work with industry-leading client organizations, deep technology and domain experts, and global teams. Incedo University, our learning platform, provides ample learning opportunities, starting with a structured onboarding program and continuing throughout the various stages of your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your skills and interests.
Our mission is to enable our clients to maximize business impact from technology by:
- Harnessing the transformational impact of emerging technologies
- Bridging the gap between business and technology
Role Description
About the Role
The DSP backend is the core data processing and API layer of Syneos Health's clinical data quality platform, built on Databricks, Delta Lake, and Python/FastAPI. This engineer will own the APIs, data pipelines, and integration layer that power clinical data ingestion from sources like Medidata Rave, map raw data to CDISC SDTM/ADaM standards, execute data quality rules, and serve results to two frontend applications. This is a technically complex role at the intersection of data engineering, API design, and regulated systems.
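To make the Bronze/Silver/Gold flow described above concrete, here is a minimal conceptual sketch in plain Python. It uses dicts in place of Delta Lake tables, and all names (the DM-style fields, the normalization rules) are illustrative assumptions, not the platform's actual schema.

```python
# Conceptual sketch of a medallion-style flow: Bronze (raw EDC export)
# -> Silver (standardized, SDTM-like records) -> Gold (per-record
# quality scores). Field names and rules are hypothetical.

def to_silver(bronze_records):
    """Standardize raw records into an SDTM-like DM domain."""
    silver = []
    for rec in bronze_records:
        silver.append({
            "USUBJID": rec["subject_id"],       # unique subject identifier
            "SEX": rec["gender"].upper()[:1],   # normalize toward M/F coding
            "AGE": int(rec["age"]),             # cast string ages to int
        })
    return silver

def to_gold(silver_records):
    """Attach a simple data-quality issue count to each record."""
    gold = []
    for rec in silver_records:
        issues = 0
        if rec["SEX"] not in ("M", "F"):
            issues += 1
        if not (0 <= rec["AGE"] <= 120):
            issues += 1
        gold.append({**rec, "quality_issues": issues})
    return gold

bronze = [
    {"subject_id": "SUBJ-001", "gender": "male", "age": "34"},
    {"subject_id": "SUBJ-002", "gender": "unknown", "age": "45"},
]
gold = to_gold(to_silver(bronze))
```

In the real platform these layers would be Delta Lake tables populated by Spark jobs rather than in-memory lists; the sketch only shows the shape of the raw-to-standardized-to-scored progression.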
Key Responsibilities
- Design, build, and maintain FastAPI-based RESTful and async APIs consumed by both DSP frontend applications.
- Develop and maintain Databricks notebooks and Python-based Delta Lake pipelines for the Bronze/Silver/Gold medallion architecture.
- Implement incremental data ingestion from clinical EDC sources (Medidata Rave, Veeva Vault) using Databricks Auto Loader and Delta Live Tables.
- Build the SDTM/ADaM mapping layer and data quality rule execution engine within the Silver and Gold layers.
- Design multi-tenant data isolation patterns using Databricks Unity Catalog (catalog-per-sponsor, ABAC policies).
- Implement GxP-compliant audit trails, row-level change tracking, and electronic record management on Delta Lake tables.
- Integrate with Azure services: Azure Key Vault (secrets), Azure Service Bus (event streaming), Azure PostgreSQL (metadata store).
- Work with the AI/ML team to expose SDTM mapping outputs and quality scores to the AI Mapping Engine.
- Maintain >85% unit/integration test coverage using pytest; contribute to automated regression suites.
- Produce and maintain technical documentation, API contracts (OpenAPI/Swagger), and GxP-required design specifications.
- Participate in architecture reviews, performance tuning, and proactive capacity planning for growing study volumes.
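As a rough illustration of two of the responsibilities above (rule execution plus audit trailing), here is a small sketch of a data quality rule engine in plain Python. The rule names, record fields, and audit-entry format are all hypothetical; the production engine would run against Delta Lake tables with GxP-compliant record keeping.

```python
# Minimal sketch of a data-quality rule engine that records an audit
# entry for every check it performs. Rules are simple predicates over
# a record dict; names and fields are illustrative only.
from datetime import datetime, timezone

RULES = {
    "age_in_range": lambda r: 0 <= r.get("AGE", -1) <= 120,
    "sex_coded":    lambda r: r.get("SEX") in ("M", "F"),
}

def run_rules(records):
    """Apply every rule to every record.

    Returns (findings, audit): findings lists failed checks only,
    while audit records every check with a UTC timestamp.
    """
    findings, audit = [], []
    for i, rec in enumerate(records):
        for name, check in RULES.items():
            passed = check(rec)
            audit.append({
                "record": i,
                "rule": name,
                "passed": passed,
                "checked_at": datetime.now(timezone.utc).isoformat(),
            })
            if not passed:
                findings.append({"record": i, "rule": name})
    return findings, audit

records = [{"AGE": 34, "SEX": "M"}, {"AGE": 150, "SEX": "X"}]
findings, audit = run_rules(records)
```

Keeping the audit trail separate from the findings mirrors the regulated-systems requirement that every check be traceable, not just the failures.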
Technical Skills
Required Qualifications
- 5–8 years of backend engineering experience with Python as the primary language.
- Proven production experience with FastAPI or comparable Python web frameworks (e.g., Starlette, Flask), including async programming patterns.
- Hands-on experience with Databricks, Apache Spark, and Delta Lake (Bronze/Silver/Gold patterns).
- Strong understanding of SQL and data modeling for analytical workloads.
- Experience designing and consuming RESTful APIs with OpenAPI/Swagger specifications.
- Familiarity with Azure cloud services (Azure Data Factory, Azure Service Bus, Azure Key Vault, Azure PostgreSQL).
- Experience in regulated environments with data traceability, audit logging, and compliance documentation requirements.
- Strong written and verbal communication skills in English.
Preferred / Nice-to-Have Skills
- Experience with CDISC standards (SDTM, ADaM, CDASH) or clinical trial data management systems.
- Familiarity with GxP validation, 21 CFR Part 11, HIPAA data handling, or pharma/biotech CRO environments.
- Experience with Databricks Unity Catalog, including ABAC policies and multi-tenant catalog architecture.
- Knowledge of Delta Live Tables (DLT) for declarative pipeline development.
- Exposure to MLflow or Databricks Model Serving for integration with AI/ML components.
- Experience with event-driven architectures using Azure Service Bus or Kafka.
Qualifications
- 4–6 years of work experience in a relevant field.
- B.Tech/B.E./M.Tech or MCA degree from a reputed university; a computer science background is preferred.
Company Value
We value diversity at Incedo. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.