SENIOR DATABRICKS ENGINEER

We are seeking a Senior Databricks Engineer to join our growing team to design, build, and optimize modern data platforms on Azure using Databricks. You will work directly with client data and analytics teams, helping them migrate, modernize, and scale their data workloads while applying best practices in data engineering, security, and governance. This is a hands-on engineering role with strong exposure to architecture decisions, client interaction, and end-to-end delivery.

Responsibilities

  • Lead the implementation of modern data platforms and architectures on Databricks (including ETL, workload migrations, and Unity Catalog).

  • Design and implement data pipelines using Azure Databricks (PySpark, SQL).

  • Build and optimize batch and streaming data workloads.

  • Migrate legacy data workloads to Azure + Databricks.

  • Implement Delta Lake patterns (medallion architecture, CDC, data quality).

  • Integrate Databricks with Azure services such as ADLS, ADF, Azure Key Vault, and Azure DevOps/GitHub.

  • Optimize performance and cost (cluster sizing, job orchestration, query tuning).

  • Collaborate with solution architects, analytics engineers, and client stakeholders.

  • Contribute to reusable accelerators, standards, and internal best practices.

  • Support client enablement through knowledge transfer and documentation.

  • Provide hands-on solution delivery, guiding and working closely with client engineers and ensuring adherence to best practices.

  • Implement governance models with Unity Catalog, including data access controls, lineage, and security frameworks.

  • Evaluate and prioritize high-value AI and ML use cases, embedding Generative AI into client strategies.

  • Act as a thought leader by contributing to client workshops, executive roundtables, and industry discussions.

Qualifications

  • 5–7+ years of experience in data engineering.

  • 3+ years of hands-on experience with Databricks, including advanced features (Delta Lake, Unity Catalog, MLflow).

  • Proven experience leading large-scale data migrations (ETL, workloads, cloud platforms).

  • Strong expertise in Azure environments. Multi-cloud experience is an asset.

  • Experience working with Azure data services (ADLS, ADF, Synapse, etc.).

  • Solid understanding of modern data architectures (lakehouse, medallion, ELT/ETL).

  • Experience with CI/CD for data workloads.

  • Strong communication skills and comfort working directly with clients.

  • Excellent understanding of data governance, security, and compliance in enterprise environments.

  • Consulting experience is strongly preferred.

  • Databricks or cloud certifications required.

Why Work with Us?

  • Be part of Canada’s leading boutique consulting firm focused on Databricks and modern data platforms.

  • Work on challenging, high-impact projects with mid-sized and enterprise clients across industries.

  • Collaborate with a senior, high-performing team that values speed, pragmatism, and client success.