Apply for Data Engineer

Employment Information

Job ID: #29639

Experience: 3–10 years

Work Mode: Hybrid

Job Type: Part Time

Location: Chennai

Description

Position Overview

We are seeking a highly skilled Data Engineer with strong experience in Snowflake, dbt, and Python, along with hands-on exposure to modern AI-assisted development tools that accelerate engineering workflows.

The ideal candidate will have a strong understanding of data warehousing concepts, scalable data pipeline development, and business-driven data modeling. Familiarity with Data Vault 2.0 methodologies will be an added advantage.

This role requires someone who can design, build, and optimize scalable data pipelines while collaborating closely with analytics, product, and AI-driven teams.


Key Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines using Snowflake, dbt, and Python.
  • Develop and optimize data models aligned with business requirements and best practices.
  • Translate business concepts into robust and scalable data solutions.
  • Implement efficient Snowflake architectures, including performance tuning and cost optimization.
  • Leverage modern AI tools (e.g., Copilot, AI coding assistants, data quality AI tools) to enhance development speed and reliability.
  • Collaborate with engineering, analytics, and product teams to ensure delivery of high-quality and trustworthy data.
  • Contribute to the design and evolution of the data warehouse and data platform architecture.
  • Apply Data Vault 2.0 methodologies in data modeling and warehouse development (added advantage).
  • Implement and maintain CI/CD processes for dbt and data pipelines.
  • Ensure strong data governance, security, and data quality practices across all workflows.

Required Skills & Experience

  • 3–10+ years of experience as a Data Engineer (flexible depending on level).

Core Technical Skills

Strong hands-on expertise in:

  • Snowflake
    • Data warehousing
    • Snowpipe
    • Streams & Tasks
    • Performance tuning and optimization
  • Python
    • Data processing
    • Automation
    • Pipeline orchestration
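
To give a flavor of the Python skills listed above (data processing, automation, pipeline orchestration), here is a minimal, self-contained sketch. All function, field, and record names are invented for illustration; a real pipeline would extract from Snowflake (for example via a connector library) and be orchestrated by a scheduler or dbt.

```python
"""Minimal ELT-style sketch: extract raw records, transform, and return them.

Every name here (extract_orders, transform, run_pipeline) is hypothetical,
not part of any actual production pipeline.
"""
from datetime import date


def extract_orders():
    # Stand-in for a source query; real code would pull from a raw/landing table.
    return [
        {"order_id": 1, "amount": "19.99", "order_date": "2024-01-05"},
        {"order_id": 2, "amount": "5.00", "order_date": "2024-01-06"},
        {"order_id": 2, "amount": "5.00", "order_date": "2024-01-06"},  # duplicate row
    ]


def transform(records):
    # Deduplicate on the business key and cast types, as an ELT "T" step might.
    seen, clean = set(), []
    for rec in records:
        if rec["order_id"] in seen:
            continue
        seen.add(rec["order_id"])
        clean.append({
            "order_id": rec["order_id"],
            "amount": float(rec["amount"]),
            "order_date": date.fromisoformat(rec["order_date"]),
        })
    return clean


def run_pipeline():
    # In practice this step would load the clean rows into a staging table.
    return transform(extract_orders())
```

In a real setting the transform logic would typically live in dbt models rather than ad-hoc Python, with Python reserved for extraction, automation, and glue code.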

Data Engineering Fundamentals

  • Dimensional data modeling
  • ETL / ELT design and best practices
  • Data warehouse architecture
  • Performance optimization
  • Data governance and data quality
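
As a toy illustration of the dimensional-modeling fundamental above, the sketch below joins a fact table to a dimension and aggregates by a dimension attribute. The tables, keys, and values are invented purely for the example.

```python
# Toy star schema: a sales fact table keyed to a customer dimension.
# All data is hypothetical and exists only to illustrate the modeling pattern.
dim_customer = {
    101: {"customer_name": "Acme Ltd", "region": "South"},
    102: {"customer_name": "Globex", "region": "North"},
}

fact_sales = [
    {"customer_key": 101, "amount": 250.0},
    {"customer_key": 102, "amount": 100.0},
    {"customer_key": 101, "amount": 50.0},
]


def sales_by_region(facts, dim):
    """Aggregate a fact measure by an attribute resolved through the dimension."""
    totals = {}
    for row in facts:
        region = dim[row["customer_key"]]["region"]
        totals[region] = totals.get(region, 0.0) + row["amount"]
    return totals
```

In a warehouse this aggregation would be a SQL join between the fact and dimension tables; the Python version only makes the key-lookup structure explicit.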

Development Practices

  • Experience using AI-assisted development tools (GitHub Copilot, LLM-based coding tools, AI test generation, etc.)
  • Strong experience with Git and modern development workflows
  • Ability to translate business requirements into scalable technical solutions
  • Strong communication, collaboration, and problem-solving skills

Codincity

India - Bengaluru

H-206 Ground Floor, Hustlehub Tech Park, 36/5, Somasundarapalya, HSR layout, Bangalore 560102

India - Coimbatore

Viya workspace, #17/1, Stark Towers, Kamarajar Nagar, Kalapatti, Coimbatore – 641 014, Tamil Nadu.

India - Chennai

Centerpoint 3, 2/4 Mount Poonamallee High Road, Next to DLF Cybercity, Ramapuram, Chennai - 600089, Tamil Nadu.

USA

PMB 1555, 10900 Research Blvd Ste 160C, Austin, TX USA 78759

Australia

77 Mort Street, Blacktown, NSW 2148, Australia

Apply for Job
*Only PDF files are allowed (max upload size: 5MB)