January 6, 2026
Senior Data Engineer
Location: Hybrid, Calgary
Type: Contract
At Impact Logic, we connect purpose-driven organizations with exceptional talent.
The Impact
This role exists to turn complex, high-volume data into trusted foundations for analytics and AI. You'll play a central role in enabling smarter decision-making, predictive insights, and intelligent products by building scalable, well-governed data platforms that teams can rely on.
At a time when the organization is accelerating its use of advanced analytics and AI, your work will directly influence how data is sourced, structured, and transformed into real-world impact.
About the company
Our client is a large, enterprise-scale organization operating critical assets and services. Data sits at the heart of how they improve reliability, efficiency, and long-term planning. With growing investment in cloud platforms, analytics, and AI, they are building modern data capabilities designed to support both operational excellence and innovation.
The Role
As a Senior Data Engineer, you'll design, build, and optimize data platforms on Azure that support analytics and AI use cases at scale. Working closely with Data Scientists, AI Engineers, and business stakeholders, you'll ensure data is reliable, accessible, and ready to power advanced insight and automation.
Key Responsibilities
- Design and implement Delta Lake architectures (Bronze / Silver / Gold) to support analytical and AI workloads.
- Build, orchestrate, and optimize data pipelines using Azure Data Factory.
- Develop and manage Databricks notebooks, jobs, and workflows on Azure.
- Prepare feature-ready datasets for machine learning and AI initiatives, including forecasting, computer vision, and NLP / LLM use cases.
- Work with large-scale structured and unstructured datasets, including time-series data, image metadata, and text-based sources.
- Apply strong data governance, security, and access control practices across the platform.
- Collaborate with technical and non-technical stakeholders to translate analytical requirements into scalable data solutions.
What Success Looks Like
- Delivered reliable, high-performance data pipelines that are trusted across analytics and AI teams.
- Established well-structured Delta Lake layers that support both reporting and advanced AI use cases.
- Enabled Data Scientists and AI Engineers to work faster by providing clean, feature-ready datasets.
- Improved platform performance, scalability, and maintainability through thoughtful design and optimization.
- Built strong working relationships across data, AI, and business teams.
About You
Essential Experience
- 3-6 years' professional experience in data engineering, analytics engineering, or data platform development within an enterprise environment.
- Hands-on experience with Databricks on Azure, including notebooks, jobs, workflows, and performance optimization.
- Current Databricks certification, demonstrating applied platform expertise.
- Proven experience designing and implementing Delta Lake architectures (Bronze / Silver / Gold).
- Strong experience building and orchestrating data pipelines using Azure Data Factory (ADF).
- Experience preparing datasets to support machine learning and AI initiatives.
- Confidence working with large-scale structured and unstructured data.
- Strong proficiency in Python and SQL for data transformation, validation, and tuning.
- Solid understanding of data governance, security, and access controls in Azure-based data platforms.
Nice to Have
- Experience working with enterprise asset management systems, ideally IBM Maximo.
- Working knowledge of Microsoft Fabric and its integration with Databricks and Azure services.
- Experience collaborating closely with Data Scientists, AI Engineers, and business stakeholders.
You'll be joining an environment that values:
- Pragmatic, outcome-focused engineering
- Collaboration across disciplines
- Thoughtful use of technology to solve real problems
- Continuous learning and improvement
Why Join?
- Work on meaningful, large-scale data and AI initiatives
- Influence how modern data platforms are designed and used
- Collaborate with experienced data and AI professionals
- Competitive salary, benefits, and long-term development opportunities
The Impact Logic Process
Every application is reviewed by our team, and shortlisted candidates are assessed through a structured, evidence-based interview focused on impact, capability, and ways of working. We work closely with both candidates and clients to ensure alignment, transparency, and long-term success.
How to Apply
Apply with your resume. If your experience aligns, one of our consultants will be in touch to talk through the role, the organization, and next steps.
Impact Logic is committed to building diverse teams and welcomes applications from all backgrounds.