Job description
Primary Skills: Oracle SQL, Python, PySpark, Snowflake, ReactJS; Claude (good to have, for building agentic workflows)
Job Title: Full Stack Data Engineer (Python, PySpark, Snowflake, ReactJS)
About the Role
We are seeking a highly skilled and motivated Full Stack Data Engineer to join our growing team. This role combines strong data engineering expertise with frontend capabilities to build scalable data platforms and user-centric applications. You will work across the stack, from data pipelines to interactive dashboards, leveraging modern technologies and cloud-based architectures.
Key Responsibilities
- Design, develop, and optimize complex queries and data models using Oracle SQL.
- Build scalable data pipelines using Python and PySpark for batch and real-time processing.
- Develop and manage data warehousing solutions on Snowflake, ensuring performance, security, and cost optimization.
- Create responsive and dynamic user interfaces using ReactJS to visualize data insights.
- Collaborate with cross-functional teams including data scientists, analysts, and product managers to deliver end-to-end solutions.
- Ensure data quality, integrity, and governance across all pipelines and systems.
- Implement best practices for code quality, testing, and CI/CD.
- Monitor and troubleshoot data workflows and application performance.
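To give candidates a feel for the pipeline work above, here is a minimal sketch in plain Python (a stand-in for a PySpark job; the table layout, column names, and quality rule are all hypothetical) showing an extract-transform-load step with a basic data-quality gate:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into row dicts (stand-in for reading a source table)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Normalize types and drop rows that fail a simple quality check."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # a real pipeline would quarantine bad records instead
        out.append({"order_id": row["order_id"], "amount": round(amount, 2)})
    return out

def load(rows: list[dict]) -> dict[str, float]:
    """Aggregate into the target model (stand-in for a warehouse write)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["order_id"]] = totals.get(row["order_id"], 0.0) + row["amount"]
    return totals

raw = "order_id,amount\nA1,10.5\nA1,2.0\nA2,oops\nA2,7.25\n"
result = load(transform(extract(raw)))
```

In a PySpark job the same shape appears as a read, a chain of DataFrame transformations, and a write, distributed across executors rather than run in-process.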
Required Skills & Qualifications
- Strong proficiency in Oracle SQL, including query optimization and performance tuning.
- Hands-on experience with Python and distributed data processing using PySpark.
- Solid experience working with Snowflake (data modeling, SnowSQL, performance tuning).
- Frontend development experience using ReactJS, including state management and API integration.
- Understanding of ETL/ELT frameworks and data pipeline architecture.
- Experience with version control systems such as Git.
- Strong problem-solving skills and attention to detail.
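The query-optimization skill above can be illustrated with SQLite as a lightweight stand-in for Oracle (the table and index names are hypothetical): adding an index on the filter column changes the plan from a full table scan to an index search.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

# Without an index: the plan reports a full scan of the table.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

# Add an index on the filter column, as one would when tuning in Oracle.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index: the plan reports an index search instead.
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 42"
).fetchall()

print(plan_before[-1][-1])  # e.g. a SCAN over orders
print(plan_after[-1][-1])   # e.g. a SEARCH USING INDEX idx_orders_customer
```

Oracle exposes the same diagnosis through EXPLAIN PLAN and its optimizer views; the habit of reading the plan before and after a change carries over directly.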
Good to Have
- Experience with Claude (Anthropic) or similar LLMs for building agentic workflows and intelligent automation.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Knowledge of containerization tools like Docker and orchestration frameworks like Kubernetes.
- Exposure to REST APIs and microservices architecture.
- Experience with workflow orchestration tools (e.g., Airflow).
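For the agentic-workflow item above, a minimal sketch of the control loop such workflows follow, with a stubbed model in place of a real Claude API call (the tool name, message format, and stub replies are all hypothetical):

```python
# Tool registry: the functions the agent is allowed to invoke.
TOOLS = {
    "lookup_row_count": lambda table: {"orders": 1000, "customers": 100}.get(table, 0),
}

def stub_model(history: list[str]) -> str:
    """Stand-in for an LLM call; a real workflow would call the Claude API here."""
    if not any(msg.startswith("tool_result:") for msg in history):
        return "tool:lookup_row_count:orders"  # model decides to call a tool
    return "final:orders has 1000 rows"        # model answers from the tool result

def run_agent(question: str, max_steps: int = 5) -> str:
    """Loop: ask the model, execute any requested tool, feed the result back."""
    history = [f"user:{question}"]
    for _ in range(max_steps):
        reply = stub_model(history)
        if reply.startswith("final:"):
            return reply.removeprefix("final:")
        _, tool_name, arg = reply.split(":", 2)  # parse the tool request
        result = TOOLS[tool_name](arg)           # execute the tool
        history.append(f"tool_result:{result}")  # append result for the next turn
    return "step budget exhausted"

answer = run_agent("How many rows are in the orders table?")
```

A production version replaces the stub with an LLM API call and structured tool schemas, but the loop of propose, execute, and feed back is the core of the pattern.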
What We Offer
- Opportunity to work on cutting-edge data and AI-driven solutions.
- Collaborative and innovative work environment.
- Competitive compensation and benefits.
- Continuous learning and professional development opportunities.
Experience: 8–12 years
Physical Location: Bangalore - Campus
Qualifications: B.E./B.Tech