Career Opportunities: Associate Director 1 (48098)
Company Overview
Incedo is a US-based consulting, data science, and technology services firm with over 3,000 people helping clients
from our six offices across the US, Mexico, and India. We help our clients achieve competitive advantage through
end-to-end digital transformation. Our uniqueness lies in bringing together strong engineering, data science, and
design capabilities coupled with deep domain understanding. We combine services and products to maximize
business impact for our clients in the telecom, banking, wealth management, product engineering, and life sciences
& healthcare industries.
Working at Incedo will give you the opportunity to work with industry-leading client organizations, deep
technology and domain experts, and global teams. Incedo University, our learning platform, provides ample
learning opportunities, starting with a structured onboarding program and continuing through the various stages of
your career. A variety of fun activities is also an integral part of our friendly work environment. Our flexible
career paths allow you to grow into a program manager, a technical architect, or a domain expert based on your
skills and interests.
Our mission is to enable our clients to maximize the business impact of technology by:
- Harnessing the transformational impact of emerging technologies
- Bridging the gap between business and technology
Role Description
Key Responsibilities
Solutioning & Architecture:
- Design and architect end-to-end data engineering solutions leveraging AWS services (S3, Glue, EMR, Lambda, Kinesis, Redshift) and Snowflake
- Apply Data Vault modeling where appropriate (prior implementation experience is desired)
- Define data lake and data warehouse architectures that support enterprise-scale analytics and business intelligence requirements
- Develop data integration strategies, ETL/ELT frameworks, and data pipelines using modern engineering practices
- Establish data governance frameworks, security protocols, and compliance standards across cloud platforms
- Evaluate emerging technologies and recommend innovative approaches to optimize data infrastructure
Delivery Management:
- Lead multiple concurrent data engineering projects from initiation through deployment and post-production support
- Manage project timelines, resources, budgets, and stakeholder expectations across complex implementations
- Coordinate with business stakeholders, data scientists, analysts, and engineering teams to ensure alignment with business objectives
- Implement Agile/Scrum methodologies and drive continuous improvement in delivery processes
- Manage risks, dependencies, and impediments to ensure on-time, high-quality deliverables
Team Leadership:
- Build, mentor, and develop a high-performing team of data engineers and technical specialists
- Provide technical guidance and code reviews to ensure best practices and quality standards
- Foster a culture of innovation, collaboration, and continuous learning within the team
- Conduct performance evaluations and support career development initiatives
Technical Delivery:
- Hands-on involvement in complex technical implementations including data modeling, pipeline development, and performance optimization
- Ensure scalability, reliability, and performance of data platforms supporting petabyte-scale data processing
- Implement DataOps practices including CI/CD pipelines, automated testing, and monitoring frameworks
- Troubleshoot production issues and implement preventive measures
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field
- 15+ years of experience in data engineering with at least 3 years in leadership roles
- Strong expertise in AWS cloud services (S3, Glue, EMR, Lambda, Athena, Kinesis, Step Functions, Lake Formation)
- Extensive hands-on experience with Snowflake including architecture, optimization, and SnowSQL
- Proficiency in programming languages such as Python, SQL, and Scala/Java
- Experience with infrastructure-as-code tools (Terraform, CloudFormation)
- Proven track record of delivering large-scale data platform projects in enterprise environments
- Strong understanding of data warehousing concepts, dimensional modeling, and ETL/ELT patterns
- Experience with big data technologies (Spark, Hadoop, Kafka)
- Demonstrated ability to translate business requirements into technical solutions
Preferred Qualifications
- Domain experience in Wealth Management / Capital Markets
- AWS certifications (Solutions Architect, Data Analytics, or similar)
- SnowPro certifications (Core or Advanced)
- Experience with data governance tools (Collibra, Alation, AWS Glue Data Catalog)
- Knowledge of streaming data architectures and real-time processing
- Familiarity with dbt (data build tool) and modern data stack components
- Experience with DataOps and MLOps practices
- Strong presentation and stakeholder management skills
- PMP, Scrum Master, or similar project management certifications
Company Values
We value diversity at Incedo. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.