Key responsibilities include but are not limited to:
- Provide technical leadership on design and architectural decisions, data platform evolution and vendor/tool selection
- Leverage expertise in Lakehouse architecture on Microsoft Fabric, including optimal use of OneLake, Dataflows Gen2, Pipelines and Synapse Data Engineering
- Build and maintain scalable data pipelines to ingest, transform and curate data from a variety of structured and semi-structured sources
- Implement and enforce data modelling standards, including medallion architecture, Delta Lake and dimensional modelling best practices (a brief illustration follows this list)
- Collaborate with analysts and business users to deliver well-structured, trusted datasets for self-service reporting and analysis in Power BI
- Establish data engineering practices that ensure reliability, performance, governance and security
- Monitor and tune workloads within the Microsoft Fabric platform to ensure cost-effective and efficient operations
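As a concrete illustration of the medallion pattern referenced above, the following is a minimal PySpark sketch of a bronze-to-silver refinement step as it might run in a Fabric Synapse Data Engineering notebook. The table and column names (`bronze_orders`, `silver_orders`, `order_id`, `amount`) are hypothetical, not part of this posting.

```python
# Minimal bronze-to-silver medallion step; all table and column names are
# illustrative. In a Fabric notebook a SparkSession is provided automatically;
# getOrCreate() keeps the sketch self-contained elsewhere.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw, append-only data from the bronze layer.
bronze = spark.read.table("bronze_orders")

# Cleanse and conform: deduplicate, cast types, drop invalid rows.
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("order_id").isNotNull())
)

# Persist the curated result to the silver layer as a Delta table.
silver.write.format("delta").mode("overwrite").saveAsTable("silver_orders")
```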
Qualifications:
- A university degree or college diploma is required
- Minimum of three years of experience in data engineering with at least two years in a cloud-native or modern data platform environment
- Background in legal, financial or other professional services environments is preferred
- Extensive, hands-on expertise with Microsoft Fabric, including Dataflows Gen2, Pipelines, Synapse Data Engineering, Notebooks, and OneLake
- Proven experience designing Lakehouse or data warehouse architectures, including data ingestion frameworks, staging layers and semantic models
- Strong SQL and T-SQL skills and familiarity with Power Query (M) and Delta Lake formats
- Understanding of data governance, data security, lineage and metadata management practices
- Ability to lead technical decisions and set standards in the absence of a dedicated Data Architect
- Strong communication skills with the ability to collaborate across technical and non-technical teams
- Experience with Power BI datasets and semantic modelling is an asset
- Familiarity with Microsoft Purview or similar governance tools is an asset
- Working knowledge of Python, PySpark, or KQL is an asset (see the PySpark sketch after this list)
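To make the Delta Lake and PySpark qualifications concrete, here is a short sketch of an incremental upsert, the kind of pattern a candidate in this role would routinely write. The MERGE API shown is standard delta-spark; the table and key names (`staging_clients`, `silver_clients`, `client_id`) are illustrative only.

```python
# Hypothetical incremental upsert into a curated Delta table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Fabric notebooks

updates = spark.read.table("staging_clients")         # illustrative staging table
target = DeltaTable.forName(spark, "silver_clients")  # illustrative curated table

# MERGE keeps the curated layer current without full reloads:
# matched keys are updated in place, new keys are inserted.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.client_id = s.client_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```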
How to Apply
To apply for this position, please submit your application, along with a cover letter and résumé, directly through our application portal.

Blakes wishes to thank all applicants for their interest. However, only those candidates selected for an interview will be contacted.