Senior Database Engineer, Pune

AALUCKS Talent Pro
Full-time | Pune, Maharashtra, India | INR 2,700,000 - 3,700,000/year

Position: Senior Database Engineer, Pune

Department: Information Technology | Role: Full-time | Experience: 7 to 12 Years | Number of Positions: 1 | Location: Pune

Skillset:

Database Engineer, Cloud Database, AWS, Postgres, Snowflake, Oracle RDS, Medallion Architecture, DBT, AI/ML Solutions, Data Security, Python, Terraform, Database design, Excellent English communication skills

Job Description:

Mission

Join our DevOps Engineering team as a Senior Database Engineer responsible for designing, optimizing, and automating cloud database solutions across AWS RDS, Postgres, and Snowflake. This role focuses on performance engineering, data integration, and automation, ensuring our data platforms are scalable, reliable, and efficient. You’ll work closely with DevOps and Product Engineering to build high-performing data infrastructure that supports critical applications and analytics.

Key Responsibilities:

Modern Data Architecture & Platform Engineering

• Design, build, and optimize database solutions using Snowflake, PostgreSQL, and Oracle RDS.

• Design and evolve cloud-native data lakehouse architectures using Snowflake, AWS, and open data formats where appropriate.

• Implement and manage Medallion Architecture (Bronze / Silver / Gold) patterns to support raw ingestion, curated analytics, and business-ready datasets.

• Build and optimize hybrid data platforms spanning operational databases (PostgreSQL / RDS) and analytical systems (Snowflake).

• Develop and maintain semantic layers and analytics models to enable consistent, reusable metrics across BI, analytics, and AI use cases.

• Engineer efficient data models, ETL/ELT pipelines, and query performance tuning for analytical and transactional workloads.

• Implement replication, partitioning, and data lifecycle management to enhance scalability and resilience.

• Manage schema evolution, data versioning, and change management in multi-environment deployments.

Advanced Data Pipelines & Orchestration

• Engineer highly reliable ELT pipelines using modern tooling (e.g., dbt, cloud-native services, event-driven ingestion).

• Design pipelines that support batch, micro-batch, and near-real-time processing.

• Implement data quality checks, schema enforcement, lineage, and observability across pipelines.

• Optimize performance, cost, and scalability across ingestion, transformation, and consumption layers. 

AI-Enabled Data Engineering

Apply AI and ML techniques to data architecture and operations, including:

• Intelligent data quality validation and anomaly detection

• Automated schema drift detection and impact analysis

• Query optimization and workload pattern analysis

• Design data foundations that support ML feature stores, training datasets, and inference pipelines.

• Collaborate with Data Science teams to ensure data platforms are AI-ready, reproducible, and governed. 

Automation, DevOps & Infrastructure as Code

• Build and manage data infrastructure as code using Terraform and cloud-native services.

• Integrate data platforms into CI/CD pipelines, enabling automated testing, deployment, and rollback of data changes.

• Develop tooling and automation (Python, SQL, APIs) to streamline provisioning, monitoring, and operational workflows. 

Security, Governance & Compliance

• Implement enterprise-grade data governance, including role-based access control, encryption, masking, and auditing.

• Enforce data contracts, ownership, and lifecycle management across the lakehouse.

• Partner with Security and Compliance teams to ensure audit readiness and regulatory alignment.


Required Skills & Experience

• 7+ years of experience in data engineering, database engineering, or data platform development in production environments.

• Strong hands-on experience with Snowflake, including performance tuning, security, and cost optimization.

• Deep expertise with PostgreSQL and AWS RDS in cloud-native architectures.

• Proven experience designing lakehouse or modern data warehouse architectures.

• Strong understanding of Medallion Architecture, semantic layers, and analytics engineering best practices.

• Experience building and operating advanced ELT pipelines using modern tooling (e.g., dbt, orchestration frameworks).

• Proficiency with SQL and Python for data transformation, automation, and tooling.

• Experience with Terraform and infrastructure-as-code for data platforms.

• Solid understanding of data governance, observability, and reliability engineering.

What Success Looks Like Within the First 90 Days:

• Fully onboarded and delivering enhancements to Snowflake and RDS environments.

• Partnering with DevOps and Product Engineering on data infrastructure improvements.

• Delivering optimized queries, schemas, and automation for key systems.

Ongoing Outcomes:

• Consistent improvement in data performance, scalability, and reliability.

• Effective automation of database provisioning and change management.

• Continuous collaboration across teams to enhance data availability and governance. 

Bonus Experience (Nice to Have):

• Experience with DBT, AWS Glue, Airflow, or similar orchestration tools.

• Familiarity with feature stores, ML pipelines, or MLOps workflows.

• Exposure to data observability platforms and cost optimization strategies.

• Relevant certifications (Snowflake SnowPro, AWS Database Specialty, etc.). 

Additional Information:

• This is a hybrid working model (3 days work from office).

• Interview process: 2-3 rounds.

Required Qualification:

Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.) in IT/CS/E&CE, or MCA

With a top product-based IT company in the pharma-tech domain

Apply for this job

Resume/CV*


Please make sure to upload a PDF

First Name*
Last Name*
Email*
Phone Number*
The hiring team may use this number to contact you about this job.
What is your current CTC?*
What is your expected CTC (Max budget is 35 LPA, based on current CTC)?*
What is your shortest possible notice period (Immediate to Max 45 days of notice period acceptable)?*
What is your current location (MUST be currently based in Pune or at least in Maharashtra)?*
Who referred you/how did you get to know about this opportunity?*

By clicking 'Submit Application', you agree to receive job application updates from AALUCKS Talent Pro via text and/or WhatsApp. Message frequency may vary. Reply STOP to unsubscribe at any time. Message & data rates may apply.