Analytics Engineer – Data Quality Lead

Shake Shack
New York, US
On-site

Job Description

Our secret to leading the way in hospitality? We put our people first!

At Shake Shack, our mission is to Stand For Something Good in all that we do. From our teams to our neighborhoods, we're committed to always doing the right thing. As one of the fastest-growing hospitality brands, we're all about crafting unforgettable experiences for our guests. We offer endless learning opportunities and the chance to make a lasting impact on our business, restaurants, and communities. As a member of the #ShackFam, you’ll have access to hands-on mentorship, training, and growth potential, all in a fun and inclusive environment.

Join us and Be a Part of Something Good.

Job Summary

We are seeking an Analytics Engineer with a quality-first mindset to join our Data & Analytics team. This role is responsible for designing, building, and maintaining robust data models, pipelines, and analytics infrastructure across a broad, multi-domain portfolio, while simultaneously serving as the internal technical quality gate for a delivery model that includes our internal team and external service partners. The Analytics Engineer – Data Quality Lead bridges hands-on engineering with oversight and standards-setting, ensuring that what gets built is not only functional but reliable, documented, and trustworthy. This role operates within a modern data stack environment and is expected to leverage AI tooling as a core part of day-to-day workflow, accelerating both personal output and broader team capability.

The ideal candidate has 3+ years of analytics engineering experience with strong dbt and SQL proficiency, a track record of working in vendor or offshore delivery models, and the technical judgment to review others' code with precision and confidence. They understand that data quality is not a phase at the end of a project but a discipline embedded in every model, every test, and every deployment decision. They get energy from making systems more reliable, not just shipping their own work. They are genuinely AI-native, using tools like GitHub Copilot, Cursor, or Claude not occasionally, but as a primary accelerant. And they have the communication skills to hold service partners to a high standard while remaining a collaborative, trusted partner to the broader team.

Responsibilities:

Hands-On Build & Engineering

  • Design, develop, and maintain dbt models, SQL transformations, and data pipelines that produce clean, analytics-ready datasets supporting reporting, analysis, ML/AI, and strategic initiatives across multiple business domains
  • Build and optimize dimensional data models that enable self-service analytics and support advanced use cases including machine learning feature engineering and AI model training
  • Own high-complexity internal workstreams such as semantic layer definitions, cross-domain data models, and metrics standardization where internal technical ownership is critical
  • Support query performance optimization and data warehouse efficiency to reduce cost and improve end-user experience
  • Develop and maintain clear documentation of data models, business logic, and data lineage to promote transparency and enable knowledge sharing across the team

Technical Quality & Service Partner Oversight

  • Serve as the internal technical quality gate for service partner deliverables, reviewing pull requests and outputs against established data modeling standards, testing requirements, and documentation expectations
  • Use AI-assisted code review tooling to conduct scalable first-pass analysis of service partner code, focusing human judgment on highest-risk decisions and architectural patterns
  • Own and continuously improve the team's data observability posture by deploying and tuning monitoring tools (e.g., Elementary, re_data, Monte Carlo, or Soda) to detect anomalies, freshness failures, and quality regressions before they surface in dashboards or downstream systems
  • Build and enforce pre-deployment checklists and release gate criteria, including automated downstream impact assessments so no change ships without a known blast radius
  • Define and maintain data contracts between data producers and consumers, creating explicit, documented agreements about what each dataset guarantees to reduce silent failures and undocumented assumptions
  • Provide technical guidance and mentorship to service partner resources and extended team members, raising overall delivery quality across the ecosystem

Cross-Functional Collaboration & Stakeholder Partnership

  • Partner with the Business Analyst, Data Product Lead, and Product Manager to translate business requirements into scalable, well-scoped data solutions
  • Collaborate with Data Engineering to ensure reliable upstream pipelines and with analytics consumers across Operations, Finance, Marketing, and other business units to understand data needs and validate that delivered solutions drive intended outcomes
  • Act as a trusted technical voice in program and project delivery conversations

Skills & Requirements

Technical Skills

dbt, SQL, AI tooling, GitHub Copilot, Cursor, Claude, Data Quality, Data Models, Data Pipelines, Analytics Infrastructure, Hospitality

Employment Type

Full-Time

Level

Mid-Level

Posted

4/21/2026
