Job Description
Job Title: Data Architect – EL1
Security Clearance: Must hold or be able to obtain Negative Vetting Level 1
Duration: 6 months
Industry: Information Technology
Functional Expertise: Data Architecture
Sub-functional Expertise: Azure Databricks, Enterprise Data Platforms, Data Governance
Location of work: Canberra ACT | Hybrid
Working arrangements: The work is to be undertaken in Canberra ACT, with an expectation of five days per week (8 hours per day), Monday to Friday, excluding public holidays. Flexible working arrangements can be negotiated.
Closing date: 9AM Monday, 25 May 2026
Description:
Our client is seeking an experienced Data Architect to support the implementation of an Enterprise Data Platform (EDP). The successful candidate will work closely with internal stakeholders and delivery partners to define, govern, and implement target data architecture requirements for the EDP.
The role will focus on translating approved EDP requirements into implementation-ready data architecture, including conformed dimensions, fact and event models, canonical entity structures, cross-dataset integration patterns, medallion layer standards, source-to-target dataset specifications, and data governance controls.
The successful candidate will also provide architecture governance and delivery assurance throughout implementation, ensuring alignment with business, security, compliance, and operational requirements while supporting sustainable handover to internal teams.
Responsibilities:
- Lead completion of the target data architecture required for Phase 1 EDP implementation and near-term expansion.
- Define shared conformed dimensions, canonical entities, surrogate key strategy, and crosswalk rules across priority datasets.
- Define fact and event models, point-in-time reporting strategy, and reusable analytical outputs to support corporate reporting, operational reporting, and workload forecasting.
- Define bronze, silver, and gold (medallion) layer standards for each dataset, ensuring the role of each layer is clear, consistent, and implementable.
- Prepare implementation-ready dataset packs for priority sources, including source profile, schema, refresh pattern, mappings, business rules, data quality rules, access constraints, retention/destruction requirements, lineage needs, and acceptance criteria.
- Define cross-dataset integration patterns, including relationship modelling and entity resolution inputs and outputs.
- Define architecture requirements for secure self-service access, search, data export, and reusable data products in alignment with role-based access, masking, and audit requirements.
- Review solution designs, dataset designs, build artefacts, release artefacts, environment/resource management plans, and operational procedures to ensure alignment with approved architecture and EDP requirements.
- Define architecture guardrails for environments, DevOps, modular pipeline design, reusable components, schema evolution, and non-disruptive delivery.
- Support migration design for priority existing workflows, dashboards, and reports into the EDP.
- Support transition and decommission architecture for migrated workflows and legacy data processes, including archive and dependency considerations.
- Support service transition and production support readiness by ensuring architecture artefacts and delivered data products are supportable, documented, and sustainable.
- Produce architecture artefacts, standards, and documentation required for delivery, acceptance, and ongoing sustainment.
- Support knowledge transfer, handover, and internal capability uplift so the delivered platform can be sustainably operated and evolved.
Key Capabilities:
- Enterprise data architecture design
- Azure Databricks architecture
- Data governance and compliance
- Dimensional and canonical data modelling
- Cross-dataset integration
- Data platform security controls
- DevOps and CI/CD governance
- Stakeholder engagement and technical leadership
- Documentation and knowledge transfer
Essential Criteria:
- Demonstrated experience designing and governing implementation of enterprise data architectures on modern cloud data platforms, preferably Azure Databricks within an Azure environment.
- Strong practical capability in dimensional modelling, conformed dimensions, fact and event modelling, canonical entity modelling, and cross-dataset integration design.
- Demonstrated experience preparing datasets for platform implementation, including source-to-target mappings, transformation rules, medallion design, data quality controls, and acceptance criteria.
- Strong knowledge of secure data platform design including RBAC, SSO integration concepts, masking, audit logging, lineage, retention and destruction controls, and handling of sensitive or agreement-controlled datasets.
- Experience reviewing and assuring implementation deliverables produced by third-party integrators, vendors, or delivery partners.
- Demonstrated understanding of Dev/Test/Prod architecture, Git-based version control, modular pipeline design, CI/CD guardrails, and service transition considerations for data platforms.
- Ability to translate business and operational needs into clear data architecture decisions, standards, and implementation artefacts.
- Strong stakeholder engagement and technical leadership skills, including the ability to work across business, ICT, security, and delivery stakeholders in a complex environment.
- Strong written communication skills and ability to produce clear, implementation-ready architecture artefacts, data models, standards, review outputs, and transition documentation.
- Professional and collaborative delivery – demonstrated professionalism, sound judgement, and a collaborative, flexible approach to supporting delivery outcomes in a dynamic environment, including the ability to contribute across related data architecture, governance, and implementation priorities as required.
Desirable Criteria:
- Experience working in law enforcement, integrity, intelligence, regulatory, or other high-sensitivity public sector environments.
- Experience designing architecture for entity resolution, network analysis, graph-style relationship modelling, or investigative analytics.
- Experience working with protected or tightly controlled datasets governed by MoUs, statutory controls, or similar data-sharing conditions.
- Experience supporting migration of operational reporting and analytics from fragmented or manual environments into a governed enterprise data platform.
- Experience contributing to internal capability uplift, documentation, train-the-trainer style handover, or service transition readiness.
How to apply:
Please submit your resume together with a completed application form addressing the essential and desirable criteria. Ensure your application highlights your relevant experience and any applicable certifications.