Job title: Lead Data Engineer
Job type: Contract
Employment type: Full-time
Pay interval: Hourly
Pay rate: Negotiable
Job published: 23/03/2026

Job Description

Job Title: Lead Data Engineer

Security Clearance: Negative Vetting Level 1

Duration: Initial 12 months, with 2 x 12-month extension options

Industry: Information Technology

Functional Expertise: Data & Analytics

Sub-functional Expertise: Data Engineering

Location of work: ACT

Working arrangements: On-site, 5 days per week in Canberra

Closing date: 9:00 AM Wednesday, 1 April 2026

Description:

Our client is seeking an experienced Lead Data Engineer to support the development, optimisation and migration of large-scale data platforms to cloud-based environments. The role focuses on building robust data pipelines and modern data architectures, and on ensuring high standards of data governance, security, and performance.

You will play a key role in enabling advanced analytics and reporting capabilities, working closely with business intelligence teams and stakeholders to deliver scalable and secure data solutions.

Responsibilities:

  • Design and maintain scalable batch and real-time data pipelines
  • Develop and optimise ETL/ELT workflows using tools such as Data Factory, Spark, or Microsoft Fabric Data Pipelines
  • Build and manage data warehouses, data lakes, and Lakehouse architectures
  • Implement solutions using Microsoft Fabric (including OneLake, Data Engineering, Data Warehouse, and Real-Time Analytics)
  • Implement and optimise semantic models in Fabric integrated with Power BI
  • Collaborate with BI teams to transition reports from legacy platforms to Power BI
  • Ensure high data quality, integrity, and availability
  • Configure governance, lineage tracking, and monitoring within Fabric
  • Implement data security best practices aligned with government standards
  • Apply encryption, role-based access control (RBAC), auditing, and monitoring frameworks

Requirements:

  • Experience designing and delivering data engineering solutions in complex environments
  • Strong capability in building and optimising data pipelines and architectures
  • Experience with Microsoft Fabric and associated tools
  • Proficiency in ETL/ELT processes and modern data platforms
  • Experience managing large datasets and cloud-based data environments
  • Knowledge of data governance, lineage, and monitoring practices
  • Strong understanding of data security principles and implementation
  • Ability to collaborate with stakeholders and cross-functional teams
  • Analytical thinking and problem-solving skills
  • Experience with trade and statistical analysis software

Key Capabilities:

  • Strong stakeholder communication skills
  • Advanced analytical and problem-solving ability
  • Ability to lead data engineering initiatives and support delivery outcomes
  • Capability to work across both legacy and modern cloud-based environments

Essential Criteria:

  • Experience: 7+ years in a similar role
  • Platforms: Microsoft Fabric, Azure Data Factory
  • Storage: OneLake, ADLS Gen2
  • Data engineering: data modelling, Delta Lake architecture, Dataflows Gen2, Fabric pipelines, DevOps
  • Legacy systems: SSAS (Multidimensional), SQL Server
  • Security: experience implementing row-level security (RLS) and security best practices

Desirable Criteria:

  • Azure and DevOps: experience with DevOps processes and tools such as Azure DevOps and Git, and with Azure SQL Databases

How to apply:

Please submit your application, including your resume and a completed application form addressing the essential and desirable criteria. Ensure your application highlights your experience and any relevant certifications.