Posted 3w ago

Team Leader - Software (35945)

@ Voya Financial
Bangalore or Pune or Delhi or Hyderabad
Hybrid | Full Time
Responsibilities: Architect pipelines, design models, develop transformations
Requirements Summary: Lead software/data engineering projects; strong leadership, collaboration, and hands-on technical expertise with data/ETL tools.
Technical Tools Mentioned: Azure Data Factory, Python, PL/SQL, DBT, PySpark, Snowflake, Databricks, Azure Function Apps, Cortex AI, GitHub Copilot
Job Description

Career Opportunities: Team Leader - Software (35945)

Requisition ID 35945 - Posted  - India
Primary Skills (Mandatory): Azure Data Factory (ADF), Python, PL/SQL, DBT, PySpark, Snowflake, Databricks, and job scheduling tools, plus Azure Function Apps and Gen AI tools such as Cortex AI and GitHub Copilot for automation and productivity.
Secondary Skills (Preferred):
5. JOB DESCRIPTION
Company Overview

About Voya India:
Voya India (earlier VFISLK) started out as a joint venture between Voya Financial, a U.S. financial services company, and SLK, a software services company, and is headquartered in Bangalore. As of August 2023, it is a wholly owned subsidiary of Voya Financial. We are a dynamic technology and business process transformation company that provides world-class technology and business process management services, with an emphasis on quality, speed, and optimization driven through automation. We support and deliver innovative solutions to Voya’s Retirement, Employee Benefits, and Investment Management businesses.

More information about us is available at: https://www.voyaindia.com
Role Summary: The engineer will work extensively with Azure Data Factory (ADF), Python, PL/SQL, DBT, PySpark, Snowflake, Databricks, and job scheduling tools, while leveraging Azure Function Apps and Gen AI tools such as Cortex AI and GitHub Copilot for automation and productivity.

Key Roles & Responsibilities
• Architect and implement scalable data pipelines and ETL workflows using ADF, PySpark, and DBT.
• Design and optimize data models for Snowflake and other cloud-based data platforms.
• Develop and maintain complex data transformation logic using Python, PL/SQL, DBT, Snowpark, the Snowsight web interface, Databricks notebooks, and PySpark.
• Integrate job schedulers and Azure Function Apps for orchestration and automation.
• Apply Gen AI tools (Cortex AI, GitHub Copilot) to accelerate development, improve code quality, and enhance productivity.
• Ensure robust data governance, security, and compliance across all solutions.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Mentor junior engineers and provide technical leadership in data engineering best practices.
• Participate in Scrum ceremonies and contribute to continuous improvement initiatives.
6. EDUCATION & CERTIFICATIONS
Education
• Bachelor of Engineering degree in Computer Science, Information Technology, or related field.
• Exposure to Investment Management or Financial Services domain preferred.
Certifications (if any): Certifications in Azure Data Engineering, Snowflake, or Databricks are a plus.
7. LOCATION & WORK ARRANGEMENT
Work Location: Bangalore/Pune/Delhi/Hyderabad
Work Model: Hybrid