Senior Data Engineer (MS Fabric/Databricks)

Contract Type:

Permanent

Location:

Melbourne

Industry:

Data Engineer

Contact Name:

Michael Mooney

Contact Email:

michael.mooney@methodrecruitment.com.au

Contact Phone:

0413245023

Posted Date:

13-May-2026

Principal / Senior Data Engineer – Fabric Greenfield Migration (Utilities & Critical Infrastructure)

We are seeking an experienced Data Engineer to play a key role in delivering a greenfield Microsoft Fabric data platform for critical infrastructure sectors, including electricity, gas, and water utilities.

This role is centred on a large-scale migration to Fabric, offering a rare opportunity to help design and build a modern, cloud-native data platform from the ground up. You’ll be working at the intersection of advanced analytics, operational systems, and next-generation data architecture, contributing to initiatives that support infrastructure modernisation at scale.


About the Role

You will join a high-performing team of data engineers and data scientists, working with high-frequency telemetry, geospatial datasets, asset registries, and operational logs.

The focus of the role is to shape and deliver a new Fabric-based platform, establishing best practices, patterns, and foundations that will underpin future analytics and data products. While exposure to Databricks is valuable, the primary objective is building out a Fabric-first ecosystem.

This is a hands-on engineering role with significant technical ownership and impact.


Key Responsibilities

  • Lead the design and build of greenfield Microsoft Fabric data platform components
  • Develop scalable, robust data pipelines supporting analytics, reporting, and ML use cases
  • Implement metadata-driven ingestion and transformation frameworks
  • Apply modern DevOps practices (Git, CI/CD, automated testing, IaC)
  • Contribute to architecture, standards, and platform design decisions
  • Collaborate closely with data scientists and stakeholders to deliver production-ready data solutions
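As a rough illustration of the metadata-driven ingestion pattern named in the responsibilities above, the idea is that load behaviour is declared in configuration records rather than hand-coded per table. This is a minimal plain-Python sketch; the table names, config keys, and `build_load_plan` helper are hypothetical, not taken from this platform.

```python
# Minimal sketch of metadata-driven ingestion: pipeline behaviour is
# driven by declarative config records rather than per-table jobs.
# All names here are illustrative, not from any real platform.

INGESTION_CONFIG = [
    {"source": "asset_registry", "target": "bronze.assets", "mode": "full"},
    {"source": "telemetry", "target": "bronze.telemetry", "mode": "incremental",
     "watermark_column": "event_time"},
]

def build_load_plan(config: list[dict]) -> list[str]:
    """Turn config records into a list of load instructions."""
    plan = []
    for entry in config:
        if entry["mode"] == "full":
            plan.append(f"TRUNCATE AND LOAD {entry['target']} FROM {entry['source']}")
        elif entry["mode"] == "incremental":
            plan.append(
                f"MERGE {entry['target']} FROM {entry['source']} "
                f"WHERE {entry['watermark_column']} > last_watermark"
            )
        else:
            raise ValueError(f"Unknown load mode: {entry['mode']}")
    return plan

for step in build_load_plan(INGESTION_CONFIG):
    print(step)
```

In a Fabric or Spark context the same config would typically live in a control table and drive pipeline or notebook execution, so onboarding a new source is a config change rather than new code.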

Technology Environment

  • Primary focus: Microsoft Fabric (core platform)
  • Azure Data Factory and Apache Spark
  • Databricks (Delta Lake, Unity Catalog) for supplementary, legacy, and adjacent workloads
  • Cloud-native lakehouse architectures
  • CI/CD pipelines and DevOps tooling
  • Exposure to machine learning and emerging GenAI / agent-based use cases

Required Experience

  • Strong hands-on experience with Microsoft Fabric and Azure-native data services
  • Deep expertise in Spark, Python, and SQL
  • Proven experience building production-grade data pipelines (e.g., CDC/CDF, SCD Type 2, control frameworks, data quality and load assurance)
  • Strong understanding of lakehouse architectures and modern data platform design
  • Experience contributing to platform builds, migrations, or greenfield environments
  • Solid DevOps capability (CI/CD, automated deployments, Git workflows)
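For candidates unfamiliar with the SCD Type 2 pattern listed above: changed dimension rows are expired rather than overwritten, so history is preserved. The sketch below shows the core logic in plain Python under assumed column names (`is_current`, `valid_from`, `valid_to`); in practice this would be a Spark/SQL MERGE against a Delta or Fabric table.

```python
from datetime import date

# Minimal sketch of an SCD Type 2 merge (illustrative only).
# Column and function names are assumptions, not a real schema.

def scd2_merge(dimension: list[dict], incoming: list[dict], key: str,
               today: date) -> list[dict]:
    """Expire changed current rows and append new current versions."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    result = list(dimension)
    for new in incoming:
        old = current.get(new[key])
        if old is not None and old["value"] == new["value"]:
            continue  # unchanged: keep the existing current row
        if old is not None:
            old["is_current"] = False  # expire the previous version
            old["valid_to"] = today
        result.append({key: new[key], "value": new["value"],
                       "valid_from": today, "valid_to": None,
                       "is_current": True})
    return result
```

Queries for "state as of date D" then filter on `valid_from <= D` and (`valid_to` is null or `valid_to > D`), which is why the expiry step matters.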

Desirable Skills

  • Experience working on Fabric migrations or platform modernisation programs
  • Exposure to Databricks ecosystems (Delta Lake, Unity Catalog)
  • Experience operating in Agile delivery environments
  • Familiarity with ML / GenAI workflows and unstructured data
  • Experience with Power BI or Tableau

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related quantitative field (Master’s or PhD preferred)
APPLY NOW
