Senior Data Engineer (GCP, Databricks), Toronto (Hybrid)

BURGEON IT SERVICES
Toronto, Canada
Hybrid

Job Description

Senior Data Engineer (GCP, Databricks)

Location: Toronto (Hybrid)

Experience Required: 10+ Years

Please share resumes at pranay@burgeonits.com

Role Overview

We are looking for a Senior Data Engineer to design and build scalable batch and real-time data pipelines using modern cloud and big data technologies.

Key Skills (Must Have)

  • GCP (Google Cloud Platform)
  • Databricks
  • Python
  • Kubernetes
  • Apache Spark (Spark Streaming)
  • Kafka / Flink (real-time streaming)
  • Hadoop ecosystem (Hive, Pig, Spark)

Core Responsibilities

  • Build and maintain data pipelines (batch & real-time)
  • Develop streaming solutions using Spark/Kafka/Flink
  • Implement Data Lakehouse & Medallion architecture
  • Work with microservices-based data platforms
  • Set up and manage CI/CD pipelines & DevOps workflows
  • Collaborate with stakeholders and ensure data quality

Nice to Have

  • Unity Catalog
  • Terraform
  • Experience with AI coding tools (GitHub Copilot, Claude)

Skills & Requirements

Technical Skills

GCP, Databricks, Python, Kubernetes, Apache Spark, Kafka, Flink, Hadoop ecosystem, Hive, Pig, Unity Catalog, Terraform, AI coding tools, Cloud, Big data, Data pipelines, Streaming solutions, Data Lakehouse, Medallion architecture, Microservices, CI/CD pipelines, DevOps workflows

Level

Senior

Posted

4/8/2026
