Analytics Engineer

Loenbro
Dallas, US
Hybrid

Job Description

Job Title: Analytics Engineer

Company: Loenbro, LLC

Business Unit/Department: Information Technology

Location: Centennial, CO; Dallas, TX; Westminster, CO

Reports to: Senior Manager – Business Analytics

Employment Type: Full-Time, Hybrid

FLSA Classification: Exempt

About Loenbro

Loenbro is a trusted, long-term construction lifecycle partner to thousands of customers across the U.S. Our market spans all industries and our service offerings include Critical Electrical, Mechanical & Structural, Soft Crafts, Inspection, Underground Maintenance and Installation, and Fabrication. Our expertise lies in simplifying the complex and establishing long-standing relationships with our partners. We have a national presence but a local approach—every customer benefits from our capabilities and our care.

At Loenbro, we don’t just offer jobs—we build careers grounded in integrity, teamwork, excellence, and purpose. Join a team where your expertise is valued, your growth is supported, and your work helps maintain and enhance the critical infrastructure that powers communities across the nation.

Job Summary

The Analytics Engineer is a technical producer role at the foundation of Loenbro’s data platform. This person designs, builds, and maintains the data pipelines, models, and architecture that make reliable reporting possible. They work within Microsoft Fabric to move raw data through the medallion layers — from ingestion to clean, structured, business-ready models — and hand off trusted datasets to the BI team to build reports against.

Essential Job Responsibilities

  • Design, build, and maintain data pipelines that ingest and transform data from source systems into the Loenbro lakehouse on Microsoft Fabric
  • Integrate data from ERP systems, project management platforms, APIs, and other operational sources into the lakehouse
  • Enforce data quality standards — building validation logic, monitoring pipelines, and resolving data integrity issues before they surface in reports
  • Use the medallion architecture — structuring bronze, silver, and gold layers to ensure data is clean, consistent, and ready for consumption by the BI team
  • Build and maintain fact and dimension tables and semantic models that serve as the trusted, authoritative data layer for all reporting
  • Collaborate with the team to understand reporting requirements and ensure the data model supports the metrics and KPIs the business needs
  • Document data models, pipeline logic, and transformation rules so the team can understand, maintain, and extend the platform over time
  • Manage deployments using SDLC and ALM best practices — version control, environment promotion, and change management
  • Stay current on Microsoft Fabric capabilities and proactively recommend improvements to the data platform architecture

Minimum Qualifications

Required:

  • Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or related field
  • 3–5+ years of experience in data engineering, analytics engineering, or a closely related technical discipline
  • Demonstrated experience building and maintaining data pipelines in a production environment
  • Hands-on experience with medallion architecture (bronze/silver/gold layers) and lakehouse design
  • Microsoft Fabric — required; Lakehouses, Data Warehouses, Dataflows Gen2, Data Pipelines, PySpark Notebooks, and OneLake
  • SQL — required; complex queries, stored procedures, transformation logic, and performance tuning
  • Python or another scripting language — required; pipeline automation, data transformation, and custom tooling
  • ETL/ELT design patterns and data integration from multiple source systems
  • OneLake, Delta Lake, and Parquet-based storage formats
  • CI/CD practices and version control (Git/Azure DevOps) applied to data pipelines and deployments
  • SDLC and ALM process discipline — moving solutions responsibly from development through production
  • Strong command of fact and dimension table design — star schema, snowflake schema, and conformed dimensions
  • Experience building and publishing semantic models / datasets that serve as the trusted layer the BI team builds reports against
  • Data quality, lineage, and governance principles — understanding how to build trust in a data platform
  • DAX proficiency for semantic layer logic, calculated measures, and KPI definitions
  • Ability to optimize semantic models for performance, scalability, and reuse across multiple reports

Preferred:

  • Power Platform experience (Power Automate, Power Apps)
  • AI/ML exposure — integrating AI-assisted pipelines or LLM outputs into data workflows
  • Software development background in any additional language
  • Microsoft DP-700 (Fabric Data Engineer Associate) or DP-600 (Fabric Analytics Engineer) certification
  • Construction or heavy industry experience is a plus

Physical Demands and Work Environment

The physical demands and work environment described here are representative of those that must be met by an employee to successfully perform the essential functions of this job.

Skills & Requirements

Technical Skills

Microsoft Fabric, SQL, Python, Power Automate, Power Apps, AI/ML, Microsoft DP-700, Microsoft DP-600, Data Engineering, Analytics Engineering

Level

Mid

Posted

4/11/2026
