ConvergeHEALTH - Data Operations Engineer, Expert Services - Innovation, Delivery & Transformation

Deloitte
Seattle, Washington, US
On-site

Job Description

As a Data Operations Engineer on Converge for Healthcare's Expert Services team, you will play a hands-on technical role connecting client source data to the foundational data models powering Deloitte's Data Studio platform - a growing portfolio of healthcare provider analytics products including Revenue Intellect, Care Intellect, SMarT Rapid Analytics, and Supply Chain Intellect.

In this role, you will work at the intersection of data engineering, cloud platform operations, and applied AI - designing and operating the cloud-native data pipelines that turn messy, real-world healthcare data into reliable, decision-ready analytics. You will work across both subscription-based product delivery and Deloitte Consulting engagements where Data Studio is embedded as a core enabler, partnering primarily with engineering, data, and product teams, and occasionally engaging directly with client data teams to resolve integration challenges.

This position is well suited for engineers who enjoy building durable data systems, working through ambiguity in real-world data, and applying emerging AI tooling to push the ceiling on what a small team can deliver within a rapidly evolving healthcare analytics product ecosystem.

Recruiting for this role ends on 05/21/2026.

Work you'll do

As a Data Operations Engineer on Converge for Healthcare's Expert Services team, you will be responsible for:

  • Data integration & pipeline engineering. Design, build, and optimize cloud-native ETL/ELT pipelines that ingest client source data and conform it to the Data Studio platform's foundational data model - making real-world healthcare data ready to power production analytics.
  • Data validation, profiling & quality. Profile, validate, and QA large, complex healthcare datasets for accuracy, completeness, and conformance to platform standards; combine traditional debugging with LLM-enabled data exploration and ML-based anomaly detection to find and resolve issues faster than manual approaches allow, partnering with client and Deloitte teams when integration issues require it.
  • Analytics & insight enablement. Develop the analytics layer of the Data Studio platform - including BI dashboards, self-service reporting, and ML Lab workflows - putting validated, production-ready data in the hands of consulting teams and clients.
  • Automation & orchestration. Implement and maintain workflow automation, monitoring, and alerting using event-driven architectures and orchestration tools, with the goal of building systems that run reliably without constant intervention.
  • Product collaboration & solution evolution. Act as a hands-on technical voice into the Data Studio platform's evolution - translating real-world delivery learnings into concrete product, data model, and platform enhancement opportunities, and partnering with product and engineering teams to validate and pressure-test new capabilities before they ship.

A strong candidate will possess these skills:

  • Expert SQL proficiency, including complex query authoring, data profiling, performance tuning, and query optimization across large-scale, messy datasets
  • Strong Python proficiency for data wrangling, scripting, automation, and integrating ML/AI capabilities into data pipelines
  • Hands-on experience designing and operating cloud-native data pipelines, with judgment around when to use which tool and how to debug distributed systems when things break; practical familiarity with AWS data services (e.g., Redshift, Glue, S3, Step Functions, Lambda), and exposure to AWS AI/ML services (e.g., Bedrock, SageMaker) is a plus
  • Sound data modeling judgment, including conforming heterogeneous source data to standardized analytics models without losing fidelity
  • Demonstrated experience working with large, complex datasets across structured, semi-structured, and unstructured formats
  • Forward-thinking engineering mindset, including fluency with modern code collaboration workflows (Git, pull requests, code review), practical use of AI-assisted development tools (e.g., Claude Code, GitHub Copilot), and curiosity about emerging AI/ML techniques such as agentic patterns, RAG, and vector databases
  • Working familiarity with modern BI tools (e.g., Tableau, Power BI, Superset) and workflow orchestration platforms (e.g., Airflow, Step Functions)
  • Strong ownership mindset and comfort with ambiguity - able to self-manage priorities, juggle concurrent workstreams, and adapt as priorities shift
  • Clear communicator who works well with distributed engineering and product teams, as well as occasional client or consulting stakeholders, including across international time zones
  • Awareness of Responsible and Trustworthy AI principles, including data privacy, bias mitigation, and governance in AI-driven workflows
  • Working knowledge of healthcare data formats and interoperability standards (e.g., claims, remittances, EMR data, HL7, FHIR, X12 EDI), with practical experience handling their quirks and version differences

Skills & Requirements

Technical Skills

SQL, Python, Cloud-native data pipelines, AWS data services, ETL/ELT pipelines, BI dashboards, Self-service reporting, ML Lab workflows, Workflow automation, Monitoring, Alerting, Event-driven architectures, Orchestration tools, Healthcare data formats, Interoperability standards, Communication, Teamwork, Problem-solving, Mentorship, Healthcare, Data engineering, Cloud platform operations, AI

Employment Type

FULL TIME

Level

Mid-Level

Posted

4/30/2026
