Job title: Lead Data Engineer
Job type: Contract
Employment type: Full-time
Pay interval: Hourly
Pay rate: Negotiable
Job published: 22/04/2025

Job Description

Job Title: Lead Data Engineer

Security Clearance: Minimum NV1 or the ability to obtain NV1 clearance

Duration: Initial 12-month term with two 12-month extension options

Industry: Information Technology

Functional Expertise: Data Engineering

Sub-functional Expertise: Big Data Technologies, Database Design, Data Analysis

Location of work: ACT

Working arrangements: Hybrid – onsite in Canberra, with up to 3 days per week of remote work negotiable after 6 months

Closing date: 9AM Tuesday, 29 April 2025

Description:

Our client is seeking a skilled Lead Data Engineer to support the delivery of critical data infrastructure and analytics capabilities. You will join a multidisciplinary team working on the enhancement of national data systems and reporting mechanisms, with a focus on improving data quality and accessibility.

Responsibilities:

  • Provide analytical support and data quality reporting across national systems

  • Collaborate with internal teams, stakeholders, and external partners

  • Design and analyse database queries, data collection systems, analytics, and reporting mechanisms

  • Acquire, maintain, and manage datasets from varied sources

  • Provide guidance aligned to strategic ICT goals

  • Develop and maintain comprehensive data pipelines and infrastructure

  • Identify trends and improve operational processes

  • Support development and testing teams in delivering solutions

  • Deliver high-level briefings and documentation

Requirements:

  • Strong background in preparing, managing, and analysing large, complex datasets

  • Ability to identify and resolve code/data quality issues

  • Proven capability to prioritise and meet business and information needs

  • Expertise in improving internal processes and automating workflows

  • Competency in working with cross-functional teams and stakeholders

Key Capabilities:

  • Data pipeline architecture development

  • Use of batch and stream processing tools (e.g., Spark)

  • Data modelling and analytics across diverse data sources

  • Effective communication with senior executives and stakeholders

  • Advanced problem-solving and statistical analysis skills

Essential Criteria:

  • 3+ years’ demonstrated experience researching, selecting, and integrating Big Data technology stacks, including batch/stream processing tools such as Spark and indexing technologies such as Elasticsearch and Solr

  • Experience modelling, extracting, loading, and analysing data from diverse sources including flat files, SQL, and NoSQL databases, and advising on improvements to scalability and performance

  • Experience in stakeholder engagement, including inter-agency collaboration

  • Relevant tertiary qualifications or equivalent experience

  • Flexibility over the lifecycle of the engagement and consideration of environmental sustainability

How to apply:

Please submit your application, including your resume and a completed application form addressing the essential and desirable criteria. Ensure your application highlights your relevant experience and any certifications you hold.