Dice is the leading career destination for tech professionals at every stage of their careers. Our client, Trebecon LLC, is seeking the following. Apply via Dice today!
Job Title: Senior Data Engineer
Location: Denver, CO (Onsite)
Job Summary
We are seeking a highly skilled Senior Data Engineer to design, build, and optimize scalable cloud-based data platforms and pipelines. The ideal candidate will have strong expertise in AWS data services, Databricks, Snowflake, and modern data engineering practices for enterprise-scale analytics, data warehousing, and real-time processing environments.
This role requires hands-on experience developing robust ETL/ELT pipelines, implementing data lake and data warehouse architectures, and ensuring high standards for data quality, testing, and operational excellence.
Key Responsibilities
- Design, develop, and maintain scalable batch and real-time data pipelines on AWS and Databricks platforms.
- Build and optimize enterprise data lake and data warehouse solutions using Redshift, Snowflake, Delta Lake, and Apache Iceberg.
- Develop ETL/ELT workflows using Python, SQL, Spark, and cloud-native technologies.
- Work with AWS services including S3, Step Functions, EventBridge, CloudWatch, Glue, Lambda, Kinesis, and EMR.
- Implement and manage data governance and metadata solutions using Unity Catalog and Glue Catalog.
- Create performant data models and dimensional schemas to support analytics and reporting needs.
- Integrate streaming and event-driven architectures using Kafka and AWS streaming services.
- Collaborate with cross-functional teams including Data Analysts, Architects, DevOps, and Business Stakeholders.
- Ensure high code quality through unit testing, integration testing, and detailed testing documentation.
- Build and maintain CI/CD pipelines and version control processes using Git and automation tools.
- Support Infrastructure as Code (IaC) practices using Terraform or similar technologies.
- Monitor, troubleshoot, and optimize data workflows for reliability, scalability, and performance.
- Participate in architecture discussions and recommend best practices for modern data engineering solutions.
Technical Skills
Required Skills & Qualifications
- Strong hands-on experience with the AWS data ecosystem:
  - Redshift
  - S3
  - Step Functions
  - EventBridge
  - CloudWatch
  - AWS Glue
  - Lambda
  - Kinesis
  - EMR
- Expertise in Databricks technologies:
  - Apache Spark
  - Delta Lake
  - Apache Iceberg
  - Unity Catalog
- Strong experience with the Snowflake data platform.
- Advanced SQL and Python programming skills.
- Experience building batch and real-time data processing pipelines.
- Strong understanding of data warehousing concepts and dimensional modeling.
- Experience with CI/CD implementation and Git-based development workflows.
- Familiarity with Infrastructure as Code tools such as Terraform.
- Experience with open-source orchestration and transformation tools such as Apache Airflow and dbt is a plus.
- Knowledge of streaming technologies including Kafka/MSK is preferred.
Soft Skills
- Strong analytical and troubleshooting skills.
- Ability to manage priorities across multiple projects simultaneously.
- Excellent organizational and communication skills.
- Strong focus on code quality, testing, and documentation.
- Ability to work effectively in a collaborative onsite environment.