Position Overview
We are seeking a highly skilled Data Engineer with strong experience in Snowflake, dbt, and Python, along with hands-on exposure to modern AI-assisted development tools that accelerate engineering workflows.
The ideal candidate will have a solid understanding of data warehousing concepts, scalable data pipeline development, and business-driven data modeling. Familiarity with Data Vault 2.0 methodologies is an added advantage.
This role requires someone who can design, build, and optimize scalable data pipelines while collaborating closely with analytics, product, and AI-driven teams.
Key Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines using Snowflake, dbt, and Python.
- Develop and optimize data models aligned with business requirements and best practices.
- Translate business concepts into robust and scalable data solutions.
- Implement efficient Snowflake architectures, including performance tuning and cost optimization.
- Leverage modern AI tools (e.g., Copilot, AI coding assistants, data quality AI tools) to enhance development speed and reliability.
- Collaborate with engineering, analytics, and product teams to ensure delivery of high-quality and trustworthy data.
- Contribute to the design and evolution of the data warehouse and data platform architecture.
- Apply Data Vault 2.0 methodologies in data modeling and warehouse development (added advantage).
- Implement and maintain CI/CD processes for dbt and data pipelines.
- Ensure strong data governance, security, and data quality practices across all workflows.
Required Skills & Experience
- 3–10+ years of experience as a Data Engineer (flexible depending on level).
Core Technical Skills
Strong hands-on expertise in:
- Snowflake
  - Data warehousing
  - Snowpipe
  - Streams & Tasks
  - Performance tuning and optimization
- Python
  - Data processing
  - Automation
  - Pipeline orchestration
Data Engineering Fundamentals
- Dimensional data modeling
- ETL / ELT design and best practices
- Data warehouse architecture
- Performance optimization
- Data governance and data quality
Development Practices
- Experience using AI-assisted development tools (e.g., GitHub Copilot, LLM-based coding tools, AI test generation)
- Strong experience with Git and modern development workflows
- Ability to translate business requirements into scalable technical solutions
- Strong communication, collaboration, and problem-solving skills