Senior Data Engineer (Azure + Databricks Lakehouse)

BrickRed Systems
Washington, US
Remote

Job Description

We are seeking a highly skilled Senior Data Engineer to join a high-performing data engineering team focused on building scalable data pipelines, integrations, and enterprise-grade data products that support analytics and operational use cases.

This role requires deep expertise in Azure cloud data engineering within a Databricks Lakehouse environment. The ideal candidate will collaborate with engineers, architects, analysts, and product managers to design and implement robust, scalable, and high-performance data solutions.

You will work with minimal supervision, exercise strong technical judgment, and proactively recommend and implement solutions aligned with business and technology goals.

Key Responsibilities

Data Engineering & Technical Delivery

  • Design, develop, and maintain scalable data pipelines and ETL/ELT processes using PySpark and SparkSQL
  • Build and manage orchestration workflows using Azure Data Factory
  • Work with Azure Data Lake and cloud-based storage systems for large-scale data processing
  • Implement streaming solutions using Kafka and/or Azure Event Hubs
  • Optimize data pipelines for performance, scalability, reliability, and cost efficiency
  • Apply best practices for data partitioning, indexing, and storage formats such as Parquet
  • Analyze DAGs and system performance to identify bottlenecks and improve efficiency
  • Implement and maintain robust CI/CD pipelines using Azure DevOps

System Design & Architecture

  • Contribute to data architecture design, including high-level designs (HLDs), low-level designs (LLDs), and data models
  • Understand system interactions, dependencies, and cross-platform data flows
  • Build end-to-end data solutions across ingestion, transformation, and consumption layers
  • Apply distributed computing concepts such as fault tolerance, idempotency, and scalability
  • Identify opportunities to automate and optimize existing data processes

Cross-Functional Collaboration

  • Partner with engineering, analytics, and product teams to deliver scalable technical solutions
  • Translate business requirements into technical designs and implementation strategies
  • Lead technical discussions and contribute to sprint planning and solution design
  • Mentor junior engineers and support overall team development

Code Quality, Testing & Documentation

  • Write clean, maintainable, and efficient code aligned with engineering standards
  • Conduct code reviews and ensure adherence to best practices
  • Develop and review unit tests and test plans
  • Maintain technical documentation including architecture diagrams and process documentation
  • Perform root cause analysis (RCA) and implement quality improvements

Project & Delivery Management

  • Deliver assigned modules and user stories within timelines
  • Support effort estimation, sprint planning, and release management activities
  • Monitor delivery progress and ensure compliance with engineering standards
  • Participate in deployment and production support processes

Innovation & Continuous Improvement

  • Design and implement modern data engineering solutions and frameworks
  • Evaluate emerging technologies and explore AI/ML and Agentic AI use cases
  • Continuously improve systems for performance, scalability, and maintainability
  • Operate effectively in fast-paced and evolving environments

Communication & Leadership

  • Create clear technical documentation and presentations for stakeholders
  • Communicate architecture decisions, implementation strategies, and technical processes
  • Mentor engineers and contribute to knowledge-sharing initiatives
  • Collaborate with stakeholders to clarify requirements and present solutions

Required Skills & Qualifications

Technical Skills

  • Strong hands-on experience with the Azure data engineering ecosystem, including:
      • Azure Data Factory
      • Azure Data Lake
      • Azure DevOps (CI/CD)
  • Proficiency in:
      • SQL (T-SQL, PostgreSQL)
      • PySpark
      • SparkSQL
  • Experience with:
      • Databricks Lakehouse architecture
      • Kafka and/or Azure Event Hubs
      • Parquet and modern data storage formats
  • Strong understanding of:
      • Data partitioning and indexing
      • Distributed computing principles
      • Performance tuning and optimization of data pipelines

Professional Skills

  • Strong analytical and problem-solving abilities
  • Ability to work independently with minimal supervision
  • Excellent communication and documentation skills
  • Experience working in Agile environments (Scrum/Kanban)
  • Ability to manage multiple priorities in fast-paced environments

Preferred Qualifications

  • Experience designing end-to-end data platforms or lakehouse architectures
  • Exposure to AI/ML or Agentic AI applications
  • Prior experience mentoring or leading engineering teams
  • Relevant Azure or Data Engineering certifications

Performance Expectations

  • Deliver high-quality, scalable, and maintainable solutions
  • Adhere to coding standards and engineering best practices
  • Reduce defects and improve system performance
  • Contribute to team knowledge sharing and continuous improvement initiatives

Skills & Requirements

PySpark, SparkSQL, Azure Data Factory, Azure Data Lake, Kafka, Azure Event Hubs, Azure DevOps, Parquet, DAGs, CI/CD, ETL/ELT, streaming, orchestration, data engineering, technical judgment, collaboration, code quality, testing, documentation, communication, leadership

Employment Type

Full-time

Level

Senior

Posted

5/7/2026
