Posted 2w ago

Specialist - Software Engineering (Mexico City, MX)

@ LTIMindtree
Mexico City, Mexico
Hybrid · Full Time
Responsibilities: Design pipelines, migrate pipelines, integrate Orion
Requirements Summary: 7–9 years of data engineering experience with GCP/BigQuery; Python/SQL; Airflow/Composer; GitHub/CI/CD; Adobe data platforms and Orion datasets; multi-cloud/hybrid environments.
Technical Tools Mentioned: Google Cloud Platform, BigQuery, Python, SQL, Apache Airflow, Cloud Composer, GitHub, CI/CD, Adobe data platforms, Orion data, AWS, Snowflake
Job Description

Role: Data Engineer – GCP / BigQuery

Role Summary


We are seeking a highly skilled Data Engineer with strong experience in Google Cloud Platform (GCP), particularly BigQuery, to design, build, and maintain scalable data pipelines. The ideal candidate will have hands-on experience with Python, SQL, Airflow/Composer, and CI/CD practices, and will play a key role in expanding and migrating Adobe-based data pipelines while integrating and operationalizing Orion datasets across hybrid cloud environments.

Required Skills and Qualifications

Experience: 7 to 9 years of applicable engineering experience

  • Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.

  • Proficiency in Python and SQL for data engineering use cases.

  • Experience with Apache Airflow and Cloud Composer.

  • Hands-on experience with GitHub and CI/CD pipelines.

  • Solid understanding of data warehousing concepts, including:

      ◦ Star schemas

      ◦ Dimensional modeling

      ◦ OLTP vs. OLAP architectures

  • Experience designing and supporting ETL/ELT pipelines in large-scale data environments.

  • Familiarity with Adobe data platforms and event-based data pipelines.

  • Experience working in multi-cloud and hybrid environments (AWS, GCP, Snowflake).

  • Strong problem-solving skills and attention to data quality and performance.
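
As a quick illustration of the dimensional-modeling concepts listed above (a fact table joined to a dimension table in a star schema, queried OLAP-style), here is a minimal, self-contained sketch. The table and column names (`dim_date`, `fact_sales`, `date_key`, `amount`) are hypothetical, and SQLite stands in for BigQuery purely for illustration:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,   -- surrogate key
        full_date TEXT,
        month TEXT
    );
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        amount REAL
    );
    INSERT INTO dim_date VALUES (20240101, '2024-01-01', 'Jan'),
                                (20240201, '2024-02-01', 'Feb');
    INSERT INTO fact_sales VALUES (20240101, 100.0),
                                  (20240101, 50.0),
                                  (20240201, 25.0);
""")

# Typical OLAP-style query: aggregate facts grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
print(rows)
```

The same shape (narrow fact table, descriptive dimensions, GROUP BY over dimension attributes) is what the role's BigQuery modeling work would involve at a much larger scale.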

Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines using Python, SQL, Airflow, and Cloud Composer on GCP.

  • Support the migration and expansion of Adobe-based data pipelines.

  • Integrate Orion datasets into existing enterprise datasets and data platforms.

  • Develop new data pipelines for Orion event data collection.

  • Implement and manage data ingestion, transformation, and orchestration workflows in BigQuery.

  • Optimize pipeline performance, including runtime, compute usage, storage tiering, and query costs, with a strong focus on BigQuery cost optimization.

  • Establish data quality checks, validation logic, and reconciliation processes to ensure data accuracy and reliability.

  • Work across hybrid data environments, supporting data movement and transformation across AWS, GCP, and Snowflake.

  • Implement and maintain CI/CD pipelines using GitHub and enterprise DevOps practices.

  • Create and maintain clear, comprehensive documentation for all data pipelines, integrations, and operational processes.

  • Collaborate with cross-functional teams to support analytics, reporting, and downstream data consumers.
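
The data-quality, validation, and reconciliation responsibilities above might look roughly like the following pure-Python sketch. The function name, field names, and tolerance are illustrative assumptions, not part of the posting:

```python
# Illustrative reconciliation check between a source extract and a loaded
# target: compare row counts and a column total, within a tolerance.
# All names here (reconcile, amount_field, etc.) are hypothetical.
def reconcile(source_rows, target_rows, amount_field="amount", tolerance=1e-6):
    checks = {}
    # Row-count reconciliation: did every source row land in the target?
    checks["row_count_match"] = len(source_rows) == len(target_rows)
    # Checksum-style reconciliation: do the numeric totals agree?
    src_total = sum(r[amount_field] for r in source_rows)
    tgt_total = sum(r[amount_field] for r in target_rows)
    checks["sum_match"] = abs(src_total - tgt_total) <= tolerance
    # Overall verdict over the individual checks.
    checks["passed"] = all(checks.values())
    return checks

source = [{"amount": 100.0}, {"amount": 50.0}]
target = [{"amount": 100.0}, {"amount": 50.0}]
result = reconcile(source, target)
print(result)
```

In practice these checks would run as a pipeline step (e.g. an Airflow task) comparing a source system against the BigQuery target, with failures blocking downstream consumers.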

Languages

  • Fluent English (mandatory)