Sr. AWS Data Engineer (Databricks & Snowflake) -- Onsite -- W2 Profiles

Jobs via Dice
Denver, US
On-site

Job Description

Dice is the leading career destination for tech experts at every stage of their careers. Our client, Trebecon LLC, is seeking the following. Apply via Dice today!

Role: AWS Data Engineer (Databricks & Snowflake)

Location: Denver, CO – Onsite – F2F Interview

Required Skills:

  • Strong hands-on experience with AWS services:
      • Redshift
      • S3
      • Step Functions
      • EventBridge
      • CloudWatch
      • Glue
      • EMR
      • Lambda
      • DynamoDB
      • DMS
      • Kinesis
  • Strong experience with Databricks:
      • Apache Spark
      • Delta Lake
      • Apache Iceberg
      • Unity Catalog
  • Strong experience with Snowflake
  • Strong SQL and Python programming skills
  • Experience with CI/CD pipelines and Git
  • Familiarity with Infrastructure as Code tools such as Terraform
  • Strong understanding of Data Warehousing and Dimensional Modeling concepts
  • Experience with Apache Airflow and dbt
  • Experience building batch and real-time streaming pipelines
  • Strong testing, debugging, and documentation skills

Responsibilities:

  • Design and develop scalable data pipelines on AWS Cloud platform
  • Build and maintain ETL/ELT workflows using Databricks, Spark, and AWS services
  • Develop real-time and batch data processing solutions
  • Create and optimize Data Lake and Data Warehouse architectures
  • Work with Snowflake, Redshift, and S3-based data platforms
  • Implement data governance and metadata management using Unity Catalog
  • Develop and maintain CI/CD pipelines for data engineering workflows
  • Monitor and troubleshoot data pipelines using CloudWatch and EventBridge
  • Collaborate with cross-functional teams including Data Analysts, Architects, and Business stakeholders
  • Ensure data quality, performance optimization, and security best practices
  • Create detailed technical documentation and testing artifacts

Preferred Skills:

  • Experience with Kafka/MSK streaming platforms
  • Experience with Spark using Scala or Python
  • Exposure to Infrastructure Automation and DevOps practices
  • Experience with healthcare, finance, or enterprise-scale data platforms

Skills & Requirements

Technical Skills

AWS, Databricks, Snowflake, SQL, Python, CI/CD, Terraform, Data Warehousing, Dimensional Modeling, Apache Airflow, dbt, Collaboration, Documentation, Data Engineering, Cloud

Employment Type

FULL TIME

Level

Senior

Posted

5/8/2026
