Data Engineering Team Lead
Client: Financial Client
Role: Data Engineering Team Lead
Job Type: Permanent
Location: Hybrid (3 days/week) - Downtown Toronto, ON
Salary: CAD $135,000 - CAD $175,000
Your New Company:
Our client, a well-known financial company, is hiring a Data Engineering Team Lead for a permanent opportunity.
Your New Role:
As a Team Lead, you will lead a technical team of data engineers, ensuring best practices in performance, security, and scalability while working on an enterprise-wide Centralized Data Platform (CDP) built on Databricks.
This role requires a deep, hands-on understanding of Databricks internals and a track record of delivering large-scale data platforms in a cloud environment.
- You'll be part of a team driving technical excellence and innovation within the data engineering practice
- You will lead and mentor a team of data engineers, conducting code reviews, design reviews, and knowledge-sharing sessions across multiple locations
- Drive the Agile/Scrum SDLC process and collaborate with team members
- Design and develop Databricks solutions leveraging Lakehouse architecture for enterprise data processing and analytics
- Develop and optimize ETL/ELT pipelines
- Create and manage structured streaming pipelines for real-time data processing
- Configure and tune Databricks clusters and Spark jobs for performance
- Utilize Delta Live Tables for data ingestion and transformations (see the sketch after this list)
- Apply Unity Catalog features and IAM best practices for security governance and access control
- Support infrastructure and resource management using Terraform
- Implement monitoring solutions for pipeline performance and data quality
- Contribute to code reviews and knowledge-sharing sessions
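For candidates less familiar with Delta Live Tables, here is a minimal, illustrative sketch of the kind of pipeline work this role involves. It is not the client's code: the table names, S3 path, and quality rule are hypothetical placeholders.

```python
import dlt
from pyspark.sql.functions import col

# Bronze: ingest raw JSON events from cloud storage with Auto Loader.
# The S3 path is a hypothetical placeholder.
@dlt.table(comment="Raw events ingested via Auto Loader")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/raw/events/")
    )

# Silver: apply a data-quality expectation and a light transformation.
@dlt.table(comment="Validated, typed events")
@dlt.expect_or_drop("valid_amount", "amount > 0")
def clean_events():
    return dlt.read_stream("raw_events").select(
        col("event_id"),
        col("amount").cast("decimal(18,2)").alias("amount"),
        col("event_ts").cast("timestamp").alias("event_ts"),
    )
```

Within a DLT pipeline, `spark` is provided by the runtime, and expectations such as `expect_or_drop` also feed the data-quality monitoring mentioned above.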
What You’ll Need to Succeed:
- 8+ years of experience in data engineering
- 3+ years of hands-on experience with Databricks platform
- Proven experience leading a team
- Strong proficiency in Python, PySpark, and Spark programming
- Demonstrable experience using AI in development
- Proven experience with AWS or similar cloud services
- Deep understanding of data modeling and SQL
- Experience with Delta Lake and Lakehouse architecture
- Strong knowledge of ETL/ELT principles and patterns
- Experience with version control systems (Git)
- Demonstrated ability to optimize data pipelines
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
Nice to Have:
- Financial services industry experience
- Experience with multiple cloud providers
- Knowledge of AI/ML implementation patterns
- API development experience
- Experience with real-time data processing
- Data governance framework experience
Technical Environment:
- Primary Platform: Databricks
- Cloud Platform: AWS (S3, Glue, Lambda)
- Tools: Delta Lake, Unity Catalog, Git
- Additional: Real-time processing, API integrations (illustrated in the sketch below)
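As an illustration of the real-time processing listed above, the following is a minimal Structured Streaming sketch, assuming a Kafka source and a Delta sink on S3; the broker address, topic, schema, and paths are hypothetical placeholders, not the client's environment.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("realtime-events").getOrCreate()

# Hypothetical schema for incoming events
schema = (
    StructType()
    .add("event_id", StringType())
    .add("amount", DoubleType())
    .add("event_ts", TimestampType())
)

# Read a Kafka topic as a stream (placeholder broker and topic)
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "events")
    .load()
)

# Kafka values arrive as bytes; parse the JSON payload into typed columns
parsed = (
    events.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table on S3; the checkpoint enables exactly-once delivery
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events")
    .outputMode("append")
    .start("s3://example-bucket/delta/events")
)
```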
What You’ll Get in Return:
The client is offering a permanent opportunity with a generous benefits package.
This posting is for an existing vacancy with the organization.
AI may be used to screen, assess or select applicants for the position.