Lead Data Engineer, GFT

Royal Bank of Canada
Toronto, CA; US
On-site

Job Description

What is the opportunity?

As Lead Data Engineer in the Data as a Service (DaaS) Team within Global Functions Technology, you will provide technical leadership and execute development deliverables for the reimagined Finance Data Platform. Leveraging public cloud infrastructure, the platform will serve as the central repository of finance-related datasets, with capabilities for the acquisition, standardization, enrichment, and provisioning of positional/trade data, sub-ledger and general-ledger trial balances, and reference data, and will include capabilities supporting reconciliation, analytics, and reporting functions.

We are seeking a Data Engineer with extensive hands-on experience in designing and developing data platforms. This role requires strong data architecture and engineering skills, effective written and verbal communication, a strong work ethic, and the ability to multi-task effectively. Additionally, you must possess strong interpersonal, organizational, and problem-solving skills, along with a demonstrated sense of urgency to respond to changing priorities.

What will you do?

This role encompasses an end-to-end system view, from data sourcing, lineage, and transformation through to storage, in support of complex advanced analytics, data management, and governance needs, and requires extensive collaboration with Business Architecture, System Architecture, business SMEs, and Data Stewards.

  • Work closely with the data team and other stakeholders to understand their data requirements and assist in building data solutions that cater to their needs.
  • Design, develop, and support new and existing data pipelines, recommending improvements and modifications.
  • Communicate strategies and processes around data modeling and architecture to cross-functional groups.
  • Identify, design, and implement internal process improvements.
  • Utilize your expertise in data transformation techniques to enrich raw data, making it more accessible and valuable for analytics and reporting purposes.
  • Build and implement ETL frameworks to improve code quality and reliability.
  • Develop scripts and programs for converting various types of data into usable formats.
  • Ensure the accuracy and consistency of data processing, results, and reporting.

What do you need to succeed?

Must Have:

  • Undergraduate degree/diploma in computer science/engineering or related technology discipline.
  • 7+ years of experience in data engineering, with at least 3 years of hands-on experience in system integration, data engineering, cloud architecture, and the tools & technologies listed below:
  • Big-Data/Lakehouse platforms such as Cloudera Data Platform, Microsoft Azure, AWS
  • Data transformation tools/technologies/platforms such as: Data Build Tool (DBT), Databricks, Snowflake.
  • Orchestration: Apache Airflow/ Stonebranch UAC/Control-M
  • Cloud and Containers: OpenShift (OCP)/Docker/Kubernetes
  • API Development: FastAPI, or other API framework
  • Security: LDAP, Kerberos, OAuth 2.0, Microsoft Entra ID, HashiCorp Vault integration
  • DevOps: GitHub Actions
  • Programming languages: Python/Scala/Java
  • Demonstrable experience with data management and governance practices.
  • Demonstrable experience with SQL with any ANSI-compliant RDBMS.
  • Demonstrable experience with generative-AI tools/technologies.

Nice to have:

  • Experience with/exposure to migrating data platforms from on-premises to off-premises (cloud) infrastructure.
  • Experience with Capital Markets or other financial technology services’ middle/back-office environments.
  • Experience with/exposure to financial services’ products.

What’s in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing, and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.

  • A comprehensive Total Rewards Program including bonuses and flexible benefits, competitive compensation, commissions, and stock where applicable.
  • Leaders who support your development through coaching and managing opportunities.
  • Ability to make a difference and lasting impact.
  • Work in a dynamic, collaborative, progressive, and high-performing team.
  • A world-class training program in financial services.
  • Flexible work/life balance options.
  • Opportunities to do challenging work.

Job Skills

Big Data Management, Cloud Computing, Database Development, Data Mining, Data Warehousing (DW), ETL Processing, Group Problem Solving, Quality Management, Requirements Analysis

Additional Job Details

Address: RBC CENTRE, 155 WELLINGTON ST W, TORONTO

City: Toronto

Country: Canada

Work hours/week: 37.5

Employment Type: Full time

Platform: TECHNOLOGY AND OPERATIONS

Job Type: Regular

Pay Type: Salaried

Posted Date: 2026-04-29

Application Deadline: 2026-05-31

Note: Applications will be accepted
