Junior Data Engineer Chicago, Illinois

Superior Insurance Partners LLC
Chicago, US

Job Description

Superior Insurance Partners is a rapidly growing insurance brokerage platform, focused primarily on providing commercial lines, personal lines, and employee benefit solutions to companies and individuals. Superior acquires and partners with leading independent insurance agencies primarily in the Midwest and Eastern US. The company’s mission is to improve the lives of its agency partners. Superior does this by creating a highly tailored plan for each of its agency partners to help them achieve their goals, and providing customized resources including accounting/finance, recruiting, HR, AMS/IT, marketing, and M&A support. Agency partners are aligned through long‑term economic incentives while leveraging the benefits of best practices, scale, and resources across Superior’s shared platform.

Superior is backed by Tyree & D'Angelo Partners ("TDP"), a leading Chicago-based private equity firm that makes control ownership investments in, and partners with, lower middle market businesses with the goal of creating meaningful value for all involved. TDP is currently investing out of its third fund and has managed and created over $3 billion of capital and company enterprise value. TDP has significant experience investing in service businesses and has completed over 1000 investment partnerships in its history.

Role Overview

We are hiring a Junior Data Engineer / Analytics Engineer to support and scale our data platform. This role focuses on building, maintaining, and improving data ingestion and ETL pipelines that connect third‑party systems (including insurance agency management & ERP systems) into our Azure‑based analytics environment.

This is a hands‑on, backend‑focused role. The successful candidate will own the operational reliability of data pipelines and help modernize existing ETL processes to be more efficient, standardized, and scalable.

Key Responsibilities

  • Build, maintain, and enhance Azure Data Factory (ADF) pipelines
  • Own day‑to‑day reliability of data ingestion and ETL processes
  • Troubleshoot pipeline failures, performance issues, and data anomalies
  • Refactor and improve existing ETL processes to increase efficiency, reliability, and maintainability
  • Reduce manual data movement through automation and standard patterns
  • Support schema changes, incremental loads, and new data elements
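The incremental-load pattern referenced above can be sketched in a few lines. This is a minimal illustration using SQLite and a hypothetical `source_policies` table with a `last_modified` watermark column; table and column names are assumptions for the example, not details of the company's actual pipeline.

```python
import sqlite3

# Set up toy source and target tables (stand-ins for real systems).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source_policies (id INTEGER, last_modified TEXT)")
cur.execute("CREATE TABLE target_policies (id INTEGER, last_modified TEXT)")
cur.executemany(
    "INSERT INTO source_policies VALUES (?, ?)",
    [(1, "2026-01-01"), (2, "2026-02-01"), (3, "2026-03-01")],
)

def incremental_load(conn, watermark):
    """Copy only rows modified after the stored watermark."""
    cur = conn.cursor()
    cur.execute(
        "SELECT id, last_modified FROM source_policies WHERE last_modified > ?",
        (watermark,),
    )
    rows = cur.fetchall()
    cur.executemany("INSERT INTO target_policies VALUES (?, ?)", rows)
    conn.commit()
    # The new watermark is the latest timestamp loaded (or the old one if none).
    return max((r[1] for r in rows), default=watermark)

new_wm = incremental_load(conn, "2026-01-15")  # picks up rows 2 and 3 only
```

In ADF the same idea is usually expressed as a watermark-driven copy activity, but the logic is identical: filter the source by the last successfully loaded timestamp, load the delta, then advance the watermark.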

Data Source Integration (APIs & Third‑Party Systems)

  • Integrate data from third‑party systems such as:
      • Insurance agency management systems (e.g., Applied EPIC, Vertafore AMS360, or similar)
      • CRM and ERP platforms
      • File‑based sources (CSV, Excel, SFTP)
      • REST‑based APIs
  • Work with API documentation to authenticate, extract, and ingest data
  • Handle pagination, rate limits, and incremental data extraction
  • Assist with vendor‑provided APIs or SDKs when available
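The pagination and rate-limit handling described above typically looks like the sketch below. The `fetch_page` function is a stand-in for a real HTTP call (e.g., via `urllib` or `requests`); the response shape (`records`, `next_page`) is an assumption for illustration, as every vendor API differs.

```python
import time

# Fake paginated API responses standing in for a real vendor endpoint.
FAKE_PAGES = {
    1: {"records": [{"id": 1}, {"id": 2}], "next_page": 2},
    2: {"records": [{"id": 3}], "next_page": None},
}

def fetch_page(page):
    """Stand-in for an HTTP GET; a real client would also retry on 429s."""
    return FAKE_PAGES[page]

def extract_all(delay=0.0):
    """Follow next_page links, pausing between calls to respect rate limits."""
    records, page = [], 1
    while page is not None:
        payload = fetch_page(page)
        records.extend(payload["records"])
        page = payload["next_page"]
        time.sleep(delay)  # crude throttle between requests
    return records

rows = extract_all()
```

For incremental extraction, the same loop is usually parameterized with a `modified_since` filter so only records changed since the last run are pulled.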

Data Quality, Validation & SQL

  • Write SQL queries to validate data completeness and accuracy
  • Reconcile ETL outputs to source system reports and extracts
  • Identify and resolve data quality issues
  • Assist with performance tuning, indexing, and schema cleanup
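A typical completeness check of the kind listed above compares row counts and finds records present in the source extract but missing from the ETL target. This is an illustrative sketch using SQLite; the table names (`source_extract`, `etl_target`) are hypothetical examples, not systems named in the posting.

```python
import sqlite3

# Toy source extract and ETL output to reconcile.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE source_extract (id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE etl_target (id INTEGER PRIMARY KEY)")
cur.executemany("INSERT INTO source_extract VALUES (?)", [(1,), (2,), (3,)])
cur.executemany("INSERT INTO etl_target VALUES (?)", [(1,), (3,)])

# Completeness: rows loaded vs. rows expected.
src_count = cur.execute("SELECT COUNT(*) FROM source_extract").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM etl_target").fetchone()[0]

# Accuracy: IDs that never made it through the pipeline.
missing = cur.execute(
    "SELECT s.id FROM source_extract s "
    "LEFT JOIN etl_target t ON s.id = t.id "
    "WHERE t.id IS NULL"
).fetchall()
```

The same anti-join pattern works against Azure SQL or SQL Server, and is the usual first step when reconciling ETL output to a source-system report.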

Operational Foundations & Documentation

  • Create and maintain runbooks for data pipelines and ingestion processes
  • Document ETL logic, assumptions, and known limitations
  • Follow and reinforce standards for naming, structure, and deployment
  • Support change tracking and controlled releases

Required Qualifications

  • Understanding of ETL concepts and data pipelines
  • Hands‑on experience or strong exposure to Azure Data Factory or similar ETL tools
  • Experience working with APIs (REST), including reading documentation and handling JSON payloads
  • Familiarity with cloud‑based databases such as Azure SQL, SQL Server, or similar
  • Strong attention to detail and data accuracy
  • Ability to document work & processes and follow defined standards

Preferred Qualifications

  • Experience with insurance systems such as Applied EPIC or Vertafore AMS360
  • Exposure to Power BI datasets or backend reporting models
  • Experience improving or refactoring existing ETL pipelines
  • Familiarity with basic authentication methods (API keys, OAuth, tokens)
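For reference, the authentication methods listed above usually reduce to a small set of HTTP header shapes. The key and token values below are placeholders, not real credentials, and the `X-API-Key` header name is one common vendor convention among several.

```python
import base64

# Static API key, sent in a vendor-defined header.
api_key_headers = {"X-API-Key": "YOUR_API_KEY"}

# OAuth 2.0 access token, sent as a Bearer credential.
bearer_headers = {"Authorization": "Bearer ACCESS_TOKEN"}

# HTTP Basic auth: base64-encode "username:password".
creds = base64.b64encode(b"user:pass").decode()
basic_headers = {"Authorization": f"Basic {creds}"}
```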

This role offers a clear growth path into a senior data engineering or platform‑focused position. As the data platform matures, this individual will take on greater ownership of ingestion architecture, standardization across new data sources, and mentorship of future hires.

$70,000 to $85,000 per year


Skills & Requirements

Technical Skills

ETL, Azure Data Factory (ADF), data ingestion, data pipelines, data quality, validation, SQL, APIs, third-party systems, data anomalies, schema changes, incremental loads, new data elements, authentication, API keys, OAuth, tokens, Power BI datasets, backend reporting models, ETL pipelines, performance tuning, indexing, schema cleanup, naming, structure, deployment, change tracking, controlled releases, data engineering, data platform

Salary: $70,000+ per year
Level: Mid
Posted: 4/13/2026
