Job title: Lead Data Engineer
Job type: Contract
Emp type: Full-time
Pay interval: Hourly
Pay rate: Negotiable
Job published: 31/03/2026

Job Description

Job Title: Lead Data Engineer

Security Clearance: Negative Vetting Level 1

Duration: Initial 12 months with 2 x 12-month extensions

Industry: Information Technology

Functional Expertise: Data and Analytics

Sub-functional Expertise: Data Engineering / Power BI Development

Location of work: ACT

Working arrangements: On-site, 5 days per week in office

Closing date: 9AM Thursday, 9 April 2026

Description:

Our client is seeking two experienced Lead Data Engineers to support enterprise data and reporting capabilities. The role focuses on developing and maintaining Power BI solutions and data platforms, including modern cloud-based data architectures. You will contribute to both project delivery and ongoing operational support, working across large-scale datasets and cloud migration initiatives.

Responsibilities:

  • Design, develop and deploy interactive dashboards and reports using Microsoft Power BI
  • Build and manage enterprise semantic models using Fabric Lakehouse and Warehouse
  • Develop and optimise data models using DAX and Power Query (M language)
  • Implement data ingestion and transformation pipelines using Fabric Data Factory
  • Work with Lakehouse architecture leveraging Delta tables
  • Develop notebooks and manage data transformations within Fabric environments
  • Implement row-level security (RLS) and data governance best practices
  • Perform Power BI Service administration including tenancy settings and capacity management
  • Implement and maintain role-based access control (RBAC) across platforms
  • Optimise performance of reports, datasets and queries
  • Collaborate with stakeholders to translate business requirements into analytics solutions
  • Manage licensing assignment and optimisation strategies
  • Support CI/CD deployment pipelines for BI artefacts
  • Ensure data quality, accuracy and documentation standards
  • Manage gateway configurations and on-premises connectivity

Requirements:

  • Strong experience developing Power BI dashboards and reports
  • Proven capability in building and managing semantic data models
  • Experience with Fabric Lakehouse, Warehouse and Data Factory
  • Advanced skills in DAX, Power Query (M) and SQL
  • Experience with data modelling and modern data architectures
  • Knowledge of Azure ecosystem including Data Lake, Azure SQL and Key Vault
  • Experience implementing security controls including RLS and RBAC
  • Familiarity with Power BI tenancy administration and licensing
  • Ability to optimise performance across datasets and reporting solutions
  • Experience supporting CI/CD pipelines for BI and data solutions

Key Capabilities:

  • Data management (SFIA Level 4/5)
  • Data visualisation (SFIA Level 4/5)
  • Data engineering (SFIA Level 4/5)

Essential Criteria:

  • 5+ years in a similar role.
  • Strong proficiency in:
    • DAX
    • Power Query (M)
    • SQL
  • Experience with:
    • Data modelling
    • Delta Lake architecture
    • Dataflows Gen2
    • Fabric pipelines
  • Knowledge of the Azure ecosystem (Azure Data Lake, Azure SQL, Azure Key Vault).
  • Experience implementing RLS and security best practices.
  • Knowledge of Microsoft licensing, roles, and Power BI tenancy administration.

Desirable Criteria:

  • Azure and DevOps: Experience with DevOps processes and tools such as Azure DevOps, Git, and Azure SQL databases.
  • Hands-on experience with Microsoft Fabric (Lakehouse, Warehouse, Data Factory).

How to apply:

Please submit your application including your resume and a completed application form addressing the essential and desirable criteria. Ensure your application highlights your experience and any relevant certifications.