Our client is looking to hire a Remote Data Engineer.
Must be a US citizen or Green Card holder.
What You’ll Do
Join our Data Platform team and own real pieces of our data warehouse end-to-end. You’ll build reliable pipelines and data models that power analytics across the business—working hands-on with modern tools like BigQuery, dbt, and Airflow. This is a highly execution-focused role: you’ll take well-defined problems and turn them into clean, scalable, production-ready data solutions.
Core Responsibilities
- Build and maintain ELT pipelines from internal systems, SaaS tools, and event streams into BigQuery
- Own data domains end-to-end (ingestion → transformation → marts)
- Develop and maintain dbt models, tests, and documentation
- Orchestrate workflows using Airflow / Cloud Composer (see the DAG sketch after this list)
- Implement data quality checks and monitoring
- Optimize queries and models for performance and cost
- Partner with analysts and stakeholders to deliver reliable datasets
- Participate in an on-call rotation and resolve pipeline issues
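To make the orchestration and dbt bullets above concrete, here is a minimal sketch of the kind of DAG this role would own. It assumes Airflow 2.4+ (as on recent Cloud Composer images); the `dag_id`, schedule, GCS bucket, dbt project path, and model selector are illustrative placeholders, not an actual pipeline.

```python
import pendulum
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="orders_elt",                    # hypothetical pipeline name
    schedule="0 6 * * *",                   # daily at 06:00 UTC
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    default_args={"retries": 2},
) as dag:
    # Load newly landed files from GCS into a raw BigQuery table.
    load_raw = BashOperator(
        task_id="load_raw",
        bash_command=(
            "bq load --source_format=NEWLINE_DELIMITED_JSON "
            "raw.orders gs://example-bucket/orders/*.json"  # placeholder bucket
        ),
    )
    # Build the downstream dbt models and run their tests in dependency order.
    dbt_build = BashOperator(
        task_id="dbt_build",
        bash_command="cd /opt/airflow/dbt && dbt build --select +orders_mart",
    )
    load_raw >> dbt_build
```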
What You Bring
- 3–5 years of experience in data engineering or data infrastructure
- Strong experience with GCP (BigQuery, Cloud Storage, Composer, Pub/Sub, Dataflow)
- Hands-on experience with dbt in production environments
- Solid experience with Airflow or similar orchestration tools
- Advanced SQL skills (performance tuning, partitioning, etc.; see the BigQuery sketch after this list)
- Proficiency in Python for data pipelines and integrations
- Experience with data modeling (star schema, SCDs, dimensional modeling)
- Familiarity with Git + code review workflows
- Understanding of data quality and observability best practices
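As an example of what "advanced SQL" means day to day in this role, the sketch below creates a date-partitioned, clustered BigQuery table and queries it with partition pruning via the google-cloud-bigquery client. The project, dataset, table, and column names are all hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

# Partitioning by date and clustering by a high-cardinality key keeps
# bytes scanned (and therefore cost) proportional to the date range queried.
ddl = """
CREATE TABLE IF NOT EXISTS marts.fct_orders
PARTITION BY DATE(ordered_at)
CLUSTER BY customer_id
AS
SELECT order_id, customer_id, ordered_at, amount
FROM raw.orders
"""
client.query(ddl).result()  # wait for the DDL job to finish

# Filtering on the partition column lets BigQuery prune partitions
# instead of scanning the whole table.
job = client.query("""
SELECT customer_id, SUM(amount) AS total_amount
FROM marts.fct_orders
WHERE DATE(ordered_at) BETWEEN '2024-01-01' AND '2024-01-31'
GROUP BY customer_id
""")
for row in job.result():
    print(row.customer_id, row.total_amount)
```

Clustering on `customer_id` is just one plausible choice; the right key depends on the dominant query patterns.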
Why This Role
- Work on a modern GCP + BigQuery + dbt stack
- Own meaningful parts of the data platform—not just tickets
Nice to Have
- Terraform or infrastructure-as-code experience
- Streaming experience (Pub/Sub, Dataflow, Kafka); a minimal Pub/Sub example follows this list
- Exposure to BI tools like Looker or Tableau
- GCP Data Engineer certification
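For candidates curious what the streaming "nice to have" looks like in practice, here is a minimal Pub/Sub consumer sketch. A production pipeline would more likely run on Dataflow; the project and subscription names are placeholders.

```python
import json
from concurrent import futures

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("example-project", "orders-events")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Decode the event payload and hand it off (here we just print it).
    event = json.loads(message.data)
    print("received event:", event.get("order_id"))
    message.ack()  # acknowledge so Pub/Sub does not redeliver

streaming_pull = subscriber.subscribe(subscription, callback=callback)
try:
    streaming_pull.result(timeout=30)  # listen for 30 seconds in this demo
except futures.TimeoutError:
    streaming_pull.cancel()    # stop pulling new messages
    streaming_pull.result()    # block until shutdown completes
```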