Data Engineer (Snowflake, DBT/Matillion), Ahmedabad
AALUCKS Talent Pro
Full-time
Ahmedabad, Gujarat, India | INR 1,200,000 - 1,600,000/year
Position: Data Engineer (Snowflake, DBT/Matillion), Ahmedabad
Department: Information Technology | Role: Full-time | Experience: 3 to 5 Years | Number of Positions: 2 | Location: Ahmedabad
Skillset:
Snowflake, DBT/Matillion, Azure Cloud, Data Factory, Databricks, Data Warehousing, SQL, ETL/ELT, SSIS, Cloud Storage, Excellent English communication skills
Job Description:
About Us
We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, we are a game-changer in any operations strategy.
We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and DBT or Matillion and be able to effectively work in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.
Key Responsibilities:
- Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.
- Build ingestion pipelines into Snowflake from various sources, including relational databases, APIs, cloud storage, and flat files.
- Implement data modelling and transformation logic to support layered architecture (e.g., staging, intermediate, and mart layers or medallion architecture) to enable reliable and reusable data assets.
- Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.
- Apply dbt best practices: modular SQL development, testing, documentation, and version control.
- Perform performance optimizations in dbt/Snowflake through clustering, query profiling, materialization, partitioning, and efficient SQL design.
- Apply CI/CD and Git-based workflows for version-controlled deployments.
- Contribute to growing internal knowledge base of dbt macros, conventions, and testing frameworks.
- Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.
- Write well-documented, maintainable code using Git for version control and CI/CD processes.
- Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.
- Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
Required Qualifications:
• 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt or Matillion (Matillion-DPC is highly preferred, not mandatory).
• Experience building and deploying dbt models in a production environment.
• Expert-level SQL and a strong understanding of ELT patterns and data modelling (Kimball/dimensional preferred).
• Familiarity with data quality and validation techniques: dbt tests, dbt docs, etc.
• Experience with Git, CI/CD, and deployment workflows in a team setting.
• Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.
Core Competencies:
o Data Engineering and ELT Development:
• Building robust and modular data pipelines using dbt.
• Writing efficient SQL for data transformation and performance tuning in Snowflake.
• Managing environments, sources, and deployment pipelines in dbt.
o Cloud Data Platform Expertise:
• Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization.
• Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
Technical Toolset:
o Languages & Frameworks:
• Python: for data transformation, notebook development, and automation.
• SQL: strong grasp of querying and performance tuning.
Best Practices and Standards:
o Knowledge of modern data architecture concepts, including layered architecture (e.g., staging → intermediate → marts, or medallion architecture).
o Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
Security & Governance:
o Access and Permissions:
• Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.
• Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.
Deployment & Monitoring:
o DevOps and Automation:
• Version control using Git, experience with CI/CD practices in a data context.
• Monitoring and logging of pipeline executions, alerting on failures.
Soft Skills:
o Communication & Collaboration:
• Ability to present solutions and handle client demos/discussions.
• Work closely with onshore and offshore teams of analysts, data scientists, and architects.
• Ability to document pipelines and transformations clearly.
• Basic Agile/Scrum familiarity – working in sprints and logging tasks.
• Comfort with ambiguity, competing priorities, and fast-changing client environments.
Education:
o Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
o Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
Please note the mandatory / most-preferred skills for this role:
• Must have experience in Snowflake
• Must have experience in DBT or Matillion (Matillion-DPC is highly preferred)
• Must have experience in SSIS
Additional Information:
Why Join Us?
• Opportunity to work on diverse and challenging projects in a consulting environment.
• Collaborative work culture that values innovation and curiosity.
• Access to cutting-edge technologies and a focus on professional development.
• Competitive compensation and benefits package.
• Be part of a dynamic team delivering impactful data solutions.
• This is a 5-days-a-week work-from-office role in Ahmedabad.
• The interview process consists of 2 rounds.
Required Qualification:
Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.) - IT/CS/E&CE/MCA
This role is with a fast-growing analytics, business intelligence, IT products, and automation company.