For a decade and a half, TIQRI has been creating breakthrough software for leading organisations in Europe, the USA, Australia, Singapore, and beyond. Backed by a Norwegian parent company and with significant operations in Sri Lanka, we are a team of global experts who architect solutions that change the way business is done and add colour to everyday living. As part of our diverse, global team, you will share our values of Transparency, Agility, Trust and Commitment. At TIQRI, our behaviour fosters disruptive innovation as we challenge the status quo and continually reach greater heights.
As a Senior Data Engineer, you will join a dynamic team developing cutting-edge software for an organization dedicated to identifying, empowering, and supporting equity-driven leaders. Its mission is to transform education and create opportunities for all children, with a core focus on reducing educational inequity.
Key Responsibilities:
- Data Pipeline Development: Design, develop, and optimize ETL/ELT pipelines to extract, transform, and load data from multiple sources. Ensure data pipelines are scalable, reliable, and maintainable.
- Process Automation: Identify and automate manual processes to improve efficiency and reduce operational overhead. Implement solutions to eliminate repetitive tasks using modern tools and scripting languages like Python.
- Data Architecture: Design and implement scalable data architectures to meet business and analytical requirements. Collaborate with stakeholders to build data models for data warehousing and analytics.
- Cloud Integration: Manage data storage solutions using Azure services such as Azure Blob Storage, Azure Data Lake, and Azure SQL. Utilize Azure Data Factory for orchestrating and automating data workflows.
- API Integrations: Develop and manage APIs for integrating various data sources, enabling seamless data sharing and retrieval.
- Data Governance: Implement and maintain data governance frameworks to ensure data quality, security, and compliance. Establish processes for data stewardship and enforce best practices across the organization.
- Business Intelligence Enablement: Work with tools like Power BI and integrate data for reporting and visualization. Collaborate with analysts to provide insights that drive decision-making.
- Collaboration and CI/CD: Deploy data solutions using CI/CD pipelines with tools like Azure DevOps or GitHub Actions. Establish best practices for development, testing, and deployment workflows.
- Data Pipeline Architecture: Design and implement enterprise-level data pipeline architectures to meet business and analytical needs. Optimize workflows for performance, scalability, and reliability.
- Monitoring and Optimization: Monitor data pipelines and optimize them for cost-efficiency and performance; troubleshoot and resolve data-related issues promptly.
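To give a concrete sense of the pipeline work described above, here is a minimal extract-transform-load sketch in Python. The sample records, table name, and in-memory SQLite target are illustrative assumptions only; in this role the equivalent production pipelines would run against Azure sources and be orchestrated with Azure Data Factory.

```python
import sqlite3

def extract():
    """Extract: pull raw records (a hard-coded stand-in for an API or file source)."""
    return [
        {"school": "North High", "enrolled": "412"},
        {"school": "East Elementary", "enrolled": "298"},
    ]

def transform(rows):
    """Transform: normalize types and drop malformed records."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"school": row["school"].strip(),
                            "enrolled": int(row["enrolled"])})
        except (KeyError, ValueError):
            continue  # skip bad rows rather than failing the whole load
    return cleaned

def load(rows, conn):
    """Load: write cleaned records into a warehouse table (SQLite stands in here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS enrollment (school TEXT, enrolled INTEGER)")
    conn.executemany("INSERT INTO enrollment VALUES (:school, :enrolled)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(enrolled) FROM enrollment").fetchone()[0]
print(total)  # 710
```

The same extract/transform/load separation scales up directly: each stage becomes an independently testable, monitorable unit, which is what makes pipelines "scalable, reliable, and maintainable" in practice.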
Qualifications:
Required Skills:
- Proficiency in Python for data processing, automation, and scripting.
- Expertise in Azure Data Services: Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure Key Vault, and Azure Service Bus.
- Strong experience with SQL for querying and data modeling.
- Proven ability to automate manual processes and implement scalable solutions.
- Knowledge of Data Governance frameworks and best practices.
- Understanding of CI/CD pipelines and tools like Azure DevOps or GitHub Actions.
- Experience with ETL/ELT processes and tools.
- Solid understanding of data warehousing and data modeling concepts.
- Proven experience in designing Data Pipeline Architectures for enterprise-level projects.
- Experience with API integrations and data exchange systems.
- Hands-on experience with Power BI for reporting and insights.
Good-to-Have Skills:
- Familiarity with Microsoft Fabric for data workflows.
- Experience with Google Analytics for extracting and analyzing web and app data.