Data Engineer (Snowflake, DBT), Hyderabad

AALUCKS Talent Pro
Full-time | Hyderabad, Telangana, India | INR 800,000 - 1,500,000/year

Position: Data Engineer (Snowflake, DBT), Hyderabad

Department: Information Technology | Role: Full-time | Experience: 3 to 5 Years | Number of Positions: 2 | Location: Hyderabad

Skillset:

Snowflake, DBT (Data Build Tool), ELT, Data Modeling, DBT Cloud, Airflow, SQL, Python, Azure Data Factory, RBAC, GDPR, Medallion Architecture, Excellent English communication skills

Job Description:

About Us

We provide companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations, where fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and efficient processes, we are a game-changer in any operations strategy.

We are looking for an experienced and results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.

Key Responsibilities:

  1. Design and implement scalable ELT pipelines using dbt on Snowflake, following industry-accepted best practices.

  2. Build ingestion pipelines from various sources, including relational databases, APIs, cloud storage, and flat files, into Snowflake.

  3. Implement data modeling and transformation logic to support a layered architecture (e.g., staging, intermediate, and mart layers, or a medallion architecture) that enables reliable and reusable data assets (see the sketch after this list).

  4. Leverage orchestration tools (e.g., Airflow, dbt Cloud, or Azure Data Factory) to schedule and monitor data workflows.

  5. Apply dbt best practices: modular SQL development, testing, documentation, and version control.

  6. Optimize dbt/Snowflake performance through clustering, query profiling, materialization strategies, partitioning, and efficient SQL design.

  7. Apply CI/CD and Git-based workflows for version-controlled deployments.

  8. Contribute to a growing internal knowledge base of dbt macros, conventions, and testing frameworks.

  9. Collaborate with multiple stakeholders such as data analysts, data scientists, and data architects to understand requirements and deliver clean, validated datasets.

  10. Write well-documented, maintainable code within the Git and CI/CD workflows described above.

  11. Participate in Agile ceremonies including sprint planning, stand-ups, and retrospectives.

  12. Support consulting engagements through clear documentation, demos, and delivery of client-ready solutions.
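
For illustration, a minimal dbt staging-layer model of the kind described in responsibilities 1, 3, and 5 might look like the sketch below. All names (the 'erp' source, the 'raw_orders' table, and the columns) are hypothetical, and the configuration shown is one common pattern rather than a prescribed standard:

    -- models/staging/stg_orders.sql (hypothetical file and source names)
    -- Staging layer: light renaming and typing only; heavier logic
    -- belongs in intermediate and mart models built on top via ref().

    {{ config(materialized='view') }}

    with source as (
        select * from {{ source('erp', 'raw_orders') }}
    ),

    renamed as (
        select
            order_id,
            customer_id,
            cast(order_ts as timestamp_ntz) as ordered_at,
            cast(amount as number(38, 2))   as order_amount
        from source
    )

    select * from renamed

A mart model would then build on this via {{ ref('stg_orders') }}, keeping each layer reusable and independently testable.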

Required Qualifications:

• 3 to 5 years of experience in data engineering roles, with 2+ years of hands-on experience in Snowflake and dbt.

• Experience building and deploying dbt models in a production environment.

• Expert-level SQL and a strong understanding of ELT patterns and data modeling (Kimball/dimensional modeling preferred).

• Familiarity with data quality and validation techniques such as dbt tests and dbt docs.

• Experience with Git, CI/CD, and deployment workflows in a team setting.

• Familiarity with orchestrating workflows using tools like dbt Cloud, Airflow, or Azure Data Factory.

Core Competencies:

o Data Engineering and ELT Development:

• Building robust and modular data pipelines using dbt.

• Writing efficient SQL for data transformation and performance tuning in Snowflake.

• Managing environments, sources, and deployment pipelines in dbt.

o Cloud Data Platform Expertise:

• Strong proficiency with Snowflake: warehouse sizing, query profiling, data loading, and performance optimization.

• Experience working with cloud storage (Azure Data Lake, AWS S3, or GCS) for ingestion and external stages.
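
As a hedged illustration of the external-stage pattern mentioned above (all object names, the storage integration, and the container URL are hypothetical, and a storage integration is assumed to already exist):

    -- Minimal sketch: ingest flat files from cloud storage through an
    -- external stage, then load them into a raw table.
    create stage if not exists raw.ext_sales_stage
        url = 'azure://myaccount.blob.core.windows.net/sales-data'
        storage_integration = azure_int
        file_format = (type = csv skip_header = 1);

    copy into raw.sales_orders
        from @raw.ext_sales_stage
        pattern = '.*orders_.*[.]csv'
        on_error = 'abort_statement';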

Technical Toolset:

o Languages & Frameworks:

• Python: for data transformation, notebook development, and automation.

• SQL: strong grasp of querying and performance tuning.

Best Practices and Standards:

o Knowledge of modern data architecture concepts including layered architecture (e.g., staging → intermediate → marts, Medallion architecture).

o Familiarity with data quality, unit testing (dbt tests), and documentation (dbt docs).
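
For example, a dbt singular test expressing one such data-quality rule could look like this sketch (model and column names are hypothetical, carried over from the staging example above); dbt reports the test as failed if the query returns any rows:

    -- tests/assert_order_amounts_positive.sql (hypothetical singular test)
    -- Any rows returned here are reported as failures by `dbt test`.
    select
        order_id,
        order_amount
    from {{ ref('stg_orders') }}
    where order_amount <= 0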

Security & Governance:

o Access and Permissions:

• Understanding of access control within Snowflake (RBAC), role hierarchies, and secure data handling.

• Familiarity with data privacy policies (GDPR basics) and encryption at rest/in transit.
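
A minimal Snowflake RBAC sketch along these lines, with all role, database, and schema names hypothetical:

    -- Read-only reporting role granted up a role hierarchy.
    create role if not exists reporting_ro;

    grant usage  on database analytics     to role reporting_ro;
    grant usage  on schema analytics.marts to role reporting_ro;
    grant select on all tables in schema analytics.marts to role reporting_ro;
    -- Cover tables created later as well.
    grant select on future tables in schema analytics.marts to role reporting_ro;

    -- Roll the role up so its privileges flow through the hierarchy.
    grant role reporting_ro to role analyst;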

Deployment & Monitoring:

o DevOps and Automation:

• Version control using Git, experience with CI/CD practices in a data context.

• Monitoring and logging of pipeline executions, alerting on failures.
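
One simple way to surface such failures, sketched here against Snowflake's ACCOUNT_USAGE views (which lag real time by up to roughly 45 minutes), is a query that an external alerting job could poll:

    -- Task runs that failed in the last 24 hours.
    select
        name,
        scheduled_time,
        error_code,
        error_message
    from snowflake.account_usage.task_history
    where state = 'FAILED'
      and scheduled_time >= dateadd('hour', -24, current_timestamp())
    order by scheduled_time desc;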

Soft Skills:

o Communication & Collaboration:

• Ability to present solutions and handle client demos/discussions.

• Work closely with onshore and offshore teams of analysts, data scientists, and architects.

• Ability to document pipelines and transformations clearly.

• Basic Agile/Scrum familiarity – working in sprints and logging tasks.

• Comfort with ambiguity, competing priorities, and fast-changing client environments.

Education:

o Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.

o Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.

Additional Information:

Why Join Us?

• Opportunity to work on diverse and challenging projects in a consulting environment.

• Collaborative work culture that values innovation and curiosity.

• Access to cutting-edge technologies and a focus on professional development.

• Competitive compensation and benefits package.

• Be part of a dynamic team delivering impactful data solutions. 

• This is a 5-day work-from-office role in Hyderabad.

• The interview process consists of 2 rounds.

• Candidates should be available to join immediately or within a maximum notice period of 15 days.

Required Qualification:

Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.) - IT/CS/E&CE/MCA

