Google Cloud Data Engineer

NMI
Toronto, Canada; US

Job Description

We are seeking a skilled Mid-Level Data Engineer to join our Data Platform team. At this level, you will be a hands-on executor and domain owner — responsible for building, maintaining, and improving the pipelines and data models that power analytics and business intelligence across the company. You will own specific areas of our BigQuery data warehouse end-to-end, delivering reliable data products within a framework set by senior and staff engineers.

This is not an architecture or strategy role: it is a role for someone who takes well-defined problems and delivers solutions with high craft and reliability. You will work closely with data analysts, analytics engineers, and product teams, and you are expected to grow toward greater technical ownership over time.

Key Duties

  • Build and maintain production-grade ELT pipelines that ingest data from internal applications, third-party SaaS tools, and event streams into our BigQuery data warehouse.
  • Own specific data domains end-to-end — from raw ingestion through to marts — ensuring your areas of the warehouse are accurate, tested, and well-documented.
  • Write and maintain dbt models, tests, macros, and documentation within our established dbt project conventions and code review process.
  • Develop and manage Airflow DAGs on Cloud Composer (or a similar orchestration tool) to orchestrate data workflows, following patterns and standards set by the team.
  • Implement data quality checks and monitoring to catch anomalies before they reach downstream consumers.
  • Optimize BigQuery queries and models for cost and performance within your domain, escalating architectural tradeoffs to senior engineers when appropriate.
  • Collaborate with analysts and stakeholders to translate business data needs into well-scoped pipeline and modeling tasks.
  • Participate in on-call rotations, respond to pipeline incidents, and write clear postmortems.
  • Contribute to team documentation and runbooks so that your work is maintainable by others.
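The data quality duty above can be sketched with two lightweight checks: one on row-count drift against recent history, one on partition freshness. This is a minimal illustrative sketch; the function names, thresholds, and schema are assumptions for illustration, not part of NMI's stack.

```python
from datetime import datetime, timedelta, timezone

def check_row_count(current_count: int, trailing_counts: list[int],
                    tolerance: float = 0.5) -> bool:
    """Flag a load whose row count deviates more than `tolerance`
    (as a fraction) from the trailing average -- a cheap anomaly check."""
    if not trailing_counts:
        return True  # no history yet; accept the first load
    avg = sum(trailing_counts) / len(trailing_counts)
    return abs(current_count - avg) <= tolerance * avg

def check_freshness(last_loaded_at: datetime, max_lag: timedelta) -> bool:
    """Fail if the most recent load is older than the allowed lag."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag
```

In practice checks like these run as dbt tests or as a validation task at the end of a DAG, so a bad load blocks downstream models rather than silently reaching consumers.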

Skills and experience

Required:

  • 3–5 years of experience in data engineering or a closely related data infrastructure role.
  • Proven experience building and maintaining scalable data pipelines and warehouse data models.
  • Strong expertise in Google Cloud Platform (BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, Dataflow).
  • Hands-on experience with dbt (data build tool) — models, tests, macros, sources, and documentation — at production scale.
  • Experience building and maintaining data pipelines with Apache Airflow or a comparable workflow orchestration tool.
  • Strong proficiency in SQL, including advanced BigQuery SQL (window functions, partitioning, clustering, query optimization).
  • Proficiency in Python for data engineering tasks, including API integrations, data processing scripts, and custom operators.
  • Familiarity with data modeling concepts: star schema, dimensional modeling, slowly changing dimensions (SCD).
  • Experience with version control (Git) and collaborative development workflows (pull requests, code review).
  • Understanding of data quality, lineage, and observability best practices.
  • Startup or growth-stage mindset — comfortable with ambiguity, rapid iteration, and evolving priorities.
  • Excellent communication skills, with the ability to collaborate effectively across technical and non-technical teams.
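Among the modeling concepts listed above, a Type 2 slowly changing dimension can be shown in miniature: when a tracked attribute changes, the current row is closed out and a new current row is appended. This is a minimal Python sketch under an assumed customer-dimension schema; in production this logic would typically live in a dbt snapshot or a SQL MERGE, not hand-rolled Python.

```python
from datetime import date

def scd2_apply(dim_rows: list[dict], update: dict, today: date) -> list[dict]:
    """Apply one source update to a Type 2 SCD table.

    Each row is a dict with hypothetical keys: customer_id, city,
    start_date, end_date, is_current. If the current row's city differs
    from the update, close it and append a new current row.
    """
    out = []
    for row in dim_rows:
        if (row["is_current"]
                and row["customer_id"] == update["customer_id"]
                and row["city"] != update["city"]):
            # close the old version, preserving its history
            out.append({**row, "end_date": today, "is_current": False})
            # append the new current version
            out.append({"customer_id": update["customer_id"],
                        "city": update["city"],
                        "start_date": today,
                        "end_date": None,
                        "is_current": True})
        else:
            out.append(row)
    return out
```

The payoff of Type 2 over overwriting in place is that point-in-time joins ("what city was this customer in when the order shipped?") remain answerable from history.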

Preferred:

  • Experience with Terraform or similar infrastructure-as-code tools for managing cloud data infrastructure.
  • Familiarity with streaming technologies such as GCP Pub/Sub, Dataflow, or Apache Kafka.
  • Knowledge of Looker, Tableau, or other BI tools and how data models power them.
  • Google Cloud Professional Data Engineer certification.

Why join us:

  • Work on a modern, best-in-class GCP and BigQuery data stack with a high-performing team.
  • Influence data platform architecture decisions and grow into a senior or staff engineering role.
  • Competitive compensation, equity, and benefits with a culture that values engineering craft and continuous learning.

As well as being part of something exciting every day, you will also receive the following benefits:

  • Annual salary + bonus
  • A remote-first culture!
  • Flex PTO
  • Health, Dental and Vision Insurance
  • 13 Paid Holidays
  • Company volunteer days

What we do!

NMI enables our partners with choice, and challenges the one-size-fits-all approach to payments. You've probably used NMI in the last 24 hours without even realizing it. We're the platform that powers success for innovative tech created by SMBs, entrepreneurs and fintech startups. We're creative problem solvers who help visionaries smash through boundaries and think beyond what's possible so they can think about what's next. But we're not just built for the tech savvy. We democratize the latest payments technology so that everyone can realize the benefits of easy payments across the full spectrum of commerce. We're all about enabling more payments in more ways and more places.


Skills & Requirements

Technical Skills

Google Cloud Platform, BigQuery, Cloud Storage, Cloud Composer, Pub/Sub, Dataflow, dbt, Apache Airflow, SQL, Python, Terraform, Communication, Collaboration, Data Engineering, Data Infrastructure, Data Pipelines, Data Warehouse, Data Modeling, Data Quality, Data Governance

Level

Mid

Posted

4/21/2026
