Data Engineer – GCP

Kaav Inc
Washington, US
Remote

Job Description

Job Title: Data Engineer – GCP

Location: Denver, CO (Remote)

Duration: Long-Term Contract

15+ years of experience required

Job Summary

We are seeking a highly skilled Data Engineer with deep expertise in Google Cloud Platform (GCP) and modern data architecture. The ideal candidate will have hands-on experience building scalable data pipelines, implementing Medallion Architecture, and delivering enterprise-grade data solutions.

Key Responsibilities

  • Design, develop, and maintain scalable batch and real-time data pipelines on GCP
  • Implement Medallion Architecture (Bronze, Silver, Gold layers)
  • Build high-performance data transformations using Python and PySpark
  • Develop and optimize complex SQL queries for analytics workloads
  • Work extensively with BigQuery for large-scale data processing
  • Develop and deploy pipelines using Cloud Dataflow
  • Orchestrate workflows using Cloud Composer (Airflow)
  • Manage data storage using Google Cloud Storage (GCS)
  • Implement CI/CD pipelines and version control (Git)
  • Ensure data security, governance, and access control using GCP IAM
  • Optimize solutions for performance, scalability, and cost-efficiency
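To make the Medallion Architecture responsibility above concrete, here is a minimal conceptual sketch of the Bronze/Silver/Gold layering, using plain Python structures as a stand-in for GCS objects and BigQuery tables. All names (`bronze`, `silver`, `gold`, the record fields) are illustrative assumptions, not details from this posting.

```python
# Conceptual Medallion sketch: Bronze = raw as ingested, Silver = validated/typed,
# Gold = business-level aggregates. In production these would be GCS objects and
# BigQuery tables; plain Python structures are used here for illustration only.
from collections import defaultdict

# Bronze layer: raw records exactly as ingested, malformed rows included.
bronze = [
    {"user": "a", "amount": "10.5", "day": "2026-05-01"},
    {"user": "b", "amount": "oops", "day": "2026-05-01"},  # malformed amount
    {"user": "a", "amount": "4.5", "day": "2026-05-01"},
]

# Silver layer: cleaned and typed -- rows failing validation are dropped.
silver = []
for row in bronze:
    try:
        silver.append({**row, "amount": float(row["amount"])})
    except ValueError:
        continue  # in practice, quarantine malformed records for inspection

# Gold layer: business aggregate (total amount per user per day).
gold = defaultdict(float)
for row in silver:
    gold[(row["user"], row["day"])] += row["amount"]

print(dict(gold))  # {('a', '2026-05-01'): 15.0}
```

In a real pipeline each layer would be a separate dataset, with the Silver and Gold transformations running as PySpark or Dataflow jobs orchestrated by Cloud Composer.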

Required Qualifications

  • Strong hands-on experience with Google Cloud Platform (GCP)
  • Expertise in BigQuery (partitioning, clustering, optimization)
  • Proven experience with Medallion Data Architecture
  • Strong programming skills in Python and PySpark
  • Advanced SQL skills (joins, window functions, tuning)
  • Experience with Cloud Dataflow
  • Experience with Cloud Composer (Airflow)
  • Experience with Google Cloud Storage (GCS)
  • Knowledge of Git and CI/CD pipelines
  • Strong understanding of GCP IAM and cloud security
  • Java (Mandatory)
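As a small illustration of the "advanced SQL (window functions)" qualification above, the sketch below computes a running total with a window function. It uses Python's stdlib `sqlite3` purely as a local stand-in for BigQuery; the table and column names are invented for the example, though the `SUM(...) OVER (ORDER BY ...)` syntax is the same in BigQuery Standard SQL.

```python
# Running-total window function, demonstrated against an in-memory SQLite
# database (stand-in for BigQuery; requires SQLite >= 3.25 for window functions).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("2026-05-01", 10.0), ("2026-05-02", 5.0), ("2026-05-03", 7.5)],
)

# SUM(...) OVER (ORDER BY day) yields a cumulative total per row.
rows = conn.execute(
    """
    SELECT day,
           amount,
           SUM(amount) OVER (ORDER BY day) AS running_total
    FROM sales
    ORDER BY day
    """
).fetchall()

for day, amount, running in rows:
    print(day, amount, running)
```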

Preferred Qualifications

  • Experience with enterprise-scale data platforms
  • Knowledge of data lakes and data warehousing concepts
  • Familiarity with real-time/streaming frameworks
  • Experience with data governance and data quality frameworks
  • Exposure to Agile/Scrum environments

Core Skills

  • Google Cloud Platform (GCP)
  • BigQuery
  • Medallion Architecture
  • Cloud Dataflow
  • Cloud Composer (Airflow)
  • Google Cloud Storage (GCS)
  • CI/CD Pipelines
  • GCP IAM
  • ETL/ELT Pipelines
  • Query & Pipeline Optimization
  • Java

Skills & Requirements

Technical Skills

Google Cloud Platform (GCP), BigQuery, Medallion Architecture, Python, PySpark, SQL, Cloud Dataflow, Cloud Composer (Airflow), Google Cloud Storage (GCS), Git, GCP IAM, Data Engineering, Cloud Computing

Employment Type

Contract

Level

Senior

Posted

5/6/2026
