Specialist - Software Engineering (Mexico City, MEX, MX)

@ LTIMindtree
Mexico City, Mexico City, Mexico
Onsite · Full Time
Responsibilities: Design pipelines, migrate pipelines, integrate datasets
Requirements Summary: Design, build, and maintain ETL/ELT data pipelines on GCP; migrate Adobe-based pipelines; integrate Orion datasets; CI/CD practices; cross-cloud orchestration.
Technical Tools Mentioned: Google Cloud Platform, BigQuery, Python, SQL, Apache Airflow, Cloud Composer, GitHub, CI/CD, Adobe data platforms, Orion datasets, AWS, Snowflake
Job Description

Senior Data Engineer – GCP

We are seeking a highly skilled Data Engineer with strong experience in Google Cloud Platform (GCP), particularly BigQuery, to design, build, and maintain scalable data pipelines. The ideal candidate will have hands-on experience with Python, SQL, Airflow/Composer, and CI/CD practices, and will play a key role in expanding and migrating Adobe-based data pipelines while integrating and operationalizing Orion datasets across hybrid cloud environments.

Key Responsibilities

  • Design, build, and maintain ETL/ELT data pipelines using Python, SQL, Apache Airflow, and Cloud Composer on GCP.

  • Support the migration and expansion of Adobe-based data pipelines.

  • Integrate Orion datasets into existing enterprise datasets and data platforms.

  • Develop new data pipelines for Orion event data collection.

  • Implement and manage data ingestion, transformation, and orchestration workflows in BigQuery.

  • Optimize pipeline performance, including runtime, compute usage, storage tiering, and query costs, with a strong focus on BigQuery cost optimization.

  • Establish data quality checks, validation logic, and reconciliation processes to ensure data accuracy and reliability.

  • Work across hybrid data environments, supporting data movement and transformation across AWS, GCP, and Snowflake.

  • Implement and maintain CI/CD pipelines using GitHub and enterprise DevOps practices.

  • Create and maintain clear, comprehensive documentation for all data pipelines, integrations, and operational processes.

  • Collaborate with cross-functional teams to support analytics, reporting, and downstream data consumers.

Required Skills and Qualifications

  • Strong hands-on experience with Google Cloud Platform (GCP), particularly BigQuery.

  • Proficiency in Python and SQL for data engineering use cases.

  • Experience with Apache Airflow and Cloud Composer.

  • Hands-on experience with GitHub and CI/CD pipelines.

  • Solid understanding of data warehousing concepts, including:

    • Star schemas

    • Dimensional modeling

    • OLTP vs. OLAP architectures

  • Experience designing and supporting ETL/ELT pipelines in large-scale data environments.

  • Familiarity with Adobe data platforms and event-based data pipelines.

  • Experience working in multi-cloud and hybrid environments (AWS, GCP, Snowflake).

  • Strong problem-solving skills and attention to data quality and performance.

Nice to Have

  • Experience with large-scale event data and real-time/near-real-time pipelines.

  • Prior experience supporting marketing or analytics platforms (Adobe ecosystem).

  • Exposure to cost optimization strategies in cloud-based data platforms.

  • Strong documentation and stakeholder communication skills.