Snowflake Data Architect with Cortex

Capgemini
New York, US
On-site

Job Description

Position Title : Snowflake Data Architect with Cortex

Location : New York, NY (Onsite/Hybrid)

Experience : 8+ Years

Employee Type : Full Time with Benefits

We are seeking a highly experienced Snowflake Data Architect with Cortex to lead the design and evolution of modern, cloud‑native data platforms and AI‑driven analytics solutions. This role will own end‑to‑end data architecture, ensuring scalable, secure, and high‑performance solutions that enable advanced analytics, generative AI, and machine learning use cases.

The ideal candidate brings a strong blend of hands‑on Snowflake architecture expertise, deep knowledge of Snowflake Cortex AI capabilities, and the ability to collaborate effectively with business, analytics, and engineering teams to deliver high‑impact, data‑driven outcomes.

Required Qualifications

  • 8+ years of experience in data architecture, data engineering, or data platform roles with at least 4 years specifically designing and optimizing solutions on Snowflake
  • Deep expertise in Snowflake features including warehouses, resource monitors, zero‑copy cloning, Time Travel, data sharing, Snowpark, Tasks/Streams, and security/governance controls
  • Strong proficiency in SQL, data modeling (conceptual, logical, physical), ETL/ELT patterns, and cloud data platforms (AWS, Azure, or GCP)
  • Proven experience designing secure, scalable architectures for analytics, reporting, and machine learning workloads
  • Solid understanding of data governance, quality, lineage, and compliance (e.g., GDPR, SOC2, HIPAA if applicable)
  • Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent experience
  • Excellent communication, stakeholder management, and leadership skills

Key Responsibilities

  • Lead end‑to‑end data architecture and solution design for Snowflake‑based data platforms including logical/physical data models, ingestion patterns (batch, streaming, Snowpipe), storage layers (raw, curated, consumption/semantic), and consumption patterns for analytics, BI, and AI/ML
  • Design and optimize scalable, high‑performance data warehouses, data lakes, and lakehouse architectures on Snowflake focusing on performance tuning, query optimization, cost management, workload/warehouse strategies, and autoscaling
  • Architect and implement AI‑powered solutions using Snowflake Cortex including Cortex LLM functions, Cortex Search for semantic/vector/RAG capabilities, Cortex Analyst for conversational analytics, Document AI, and integration with external LLMs (e.g., for fine‑tuning, agents, and multimodal data processing)
  • Define and enforce data governance, security, and compliance frameworks including RBAC, row/column access policies, dynamic data masking, encryption, secure data sharing, and private listings
  • Design data pipelines integrating with various sources (on‑prem, cloud, SaaS) and orchestration tools; implement real‑time capabilities using Streams, Tasks, and Snowpark (Python/Scala/Java)
  • Collaborate with data engineers, analysts, scientists, and business stakeholders to deliver governed, reusable data products that accelerate analytics and AI initiatives
  • Provide technical leadership in migrations to Snowflake from legacy systems (e.g., on‑prem warehouses, other clouds) and establish reference architectures, patterns, and standards
  • Monitor platform health, optimize for cost/performance, and implement disaster recovery, replication, and high‑availability strategies
  • Mentor junior architects and engineers, conduct design reviews, and promote best practices in data modeling (e.g., Data Vault, Kimball, or hybrid semantic modeling) and AI‑ready data foundations

Preferred Skills and Experience

  • Hands‑on experience with Snowflake Cortex AI capabilities including Cortex Search, Cortex Analyst, LLM functions, vector embeddings, RAG patterns, and building AI agents or applications within Snowflake
  • SnowPro Core, Advanced, or Architect certifications
  • Experience with modern data stack tools such as dbt, Airflow, Kafka, Spark, Fivetran, Matillion, or similar
  • Knowledge of AI/ML workflows, vector databases, semantic search, and integrating structured and unstructured data for generative AI
  • Background in Data Vault 2.0, dimensional modeling, or domain‑driven design
  • Experience in regulated industries (finance, healthcare, pharma) or large‑scale enterprise environments is a plus

If you are interested, please share your resume at abuzar.umar@capgemini.com

Life At Capgemini

Capgemini supports all aspects of your well-being throughout the changing stages of your life and career. For eligible employees, we offer:

  • Flexible work
  • Healthcare including dental, vision, mental health, and well-being programs
  • Financial well-being programs such as 401(k) and Employee Share Ownership Plan
  • Paid time off and paid holidays
  • Paid parental leave
  • Family building benefits like adoption assistance, surrogacy, and cryopreservation

Skills & Requirements

Technical Skills

Snowflake architecture, Data modeling, ETL/ELT patterns, Cloud data platforms, Data governance, Data quality, Data lineage, Data compliance, SQL, Snowflake Cortex AI, Snowpark, Streams, Tasks, Python, Scala, Java, Communication, Stakeholder management, Leadership, AI, ML, Data engineering, Data platform

Salary

$90,786 - $110,358 per year

Employment Type

Full Time

Level

Senior

Posted

4/28/2026
