Lead Data Engineer (Snowflake)

Prairie Consulting Services
Chicago, US
Hybrid

Job Description

Senior / Lead Snowflake Data Engineer (Greenfield Build, Streaming, Multi-Cloud)

Location: Downtown Chicago (Hybrid, 3 days/week on-site)

Overview

We are seeking a Senior Snowflake Data Engineer with proven greenfield experience—someone who has designed and built data platforms from scratch, not just enhanced existing systems.

You will play a critical role in foundational architecture, building scalable data pipelines, and establishing best practices across a modern Snowflake-based ecosystem spanning AWS and Azure.

What You’ll Do

  • Architect and implement a new Snowflake data warehouse environment from scratch
  • Build end-to-end ingestion frameworks supporting:
      ◦ Batch/file-based ingestion (S3, Blob Storage)
      ◦ Real-time streaming (Snowpipe, Kafka, Event Hubs)

  • Develop scalable transformations using Snowflake SQL, stored procedures, and Python
  • Define and implement data models (dimensional/star schema) aligned with business needs
  • Establish data quality, monitoring, and observability frameworks
  • Optimize Snowflake performance and cost efficiency
  • Integrate with AWS (S3, Lambda, CloudWatch) and Azure (ADF, Event Hubs, Functions)
  • Implement CI/CD pipelines and Infrastructure as Code (Terraform/CloudFormation)
  • Collaborate with stakeholders to translate loosely defined requirements into scalable solutions
  • Mentor team members and help define data engineering standards

Required Expertise

  • 5–8+ years in data engineering, including 3+ years of strong Snowflake experience
  • Proven track record of greenfield data platform implementations

  • Deep expertise in:
      ◦ Advanced SQL and Snowflake optimization
      ◦ Batch and streaming data pipelines
      ◦ Data modeling and warehouse design

  • Hands-on experience with both AWS and Azure ecosystems
  • Strong Python programming skills
  • Experience delivering production-grade, scalable data systems

Preferred (What Sets You Apart)

  • Snowflake Certifications (SnowPro Core / Advanced)
  • Experience building real-time streaming architectures using Kafka or similar technologies
  • Strong experience with Terraform or Infrastructure as Code
  • Familiarity with Docker, Kubernetes, and modern DevOps practices
  • Experience with data observability / quality tools (Monte Carlo, Great Expectations, etc.)
  • Exposure to BI tools (Tableau, Power BI, Looker)
  • Prior experience in high-scale enterprise data environments

What We’re Looking For

  • Engineers who own problems end-to-end, not just write code
  • Strong systems thinking and architectural mindset
  • Ability to work across multi-cloud ecosystems
  • Someone who can optimize, scale, and modernize data platforms, not just maintain them

Skills & Requirements

Technical Skills

Snowflake, AWS, Azure, Python, SQL, Terraform, Kubernetes, Docker, CI/CD pipelines, Infrastructure as Code, data observability, data quality, BI tools, problem-solving, collaboration, communication, systems thinking, architectural mindset, SnowPro Core, SnowPro Advanced, data engineering, cloud computing, data warehousing, streaming architectures

Employment Type

Full-time

Level

Senior

Posted

4/20/2026
