Data Engineer – IBM Quantum

IBM
Los Angeles, US
On-site

Job Description

Introduction

IBM Quantum is building the world’s leading quantum computing systems, software, and cloud services. The Data Engineer in this role will design and operate the data pipelines that power insight into quantum hardware performance, system reliability, user workloads, and platform operations. You will work closely with quantum hardware, firmware, cloud, and product teams to turn diverse technical datasets into trusted analytics assets that guide decision-making across IBM Quantum’s roadmap.

Your Role And Responsibilities

As a seasoned Data Engineer specializing in Data Integration, you will design and build solutions to transfer data from operational and external environments to the business intelligence environment. Your expertise will ensure the seamless flow of data throughout the business intelligence solution's lifecycle. Your primary responsibilities will include:

Design Data Integration Solutions: Create and implement Extract, Transform, and Load (ETL) processes to facilitate data transfer between environments.

Develop ETL Processes: Build and maintain efficient ETL processes to ensure accurate and timely data flow, adhering to best practices and industry standards.

Ensure Seamless Data Flow: Monitor and troubleshoot data integration issues, collaborating with stakeholders to resolve problems and optimize data flow.

Optimize Data Integration Solutions: Continuously evaluate and improve data integration solutions, identifying opportunities for process improvements and efficiency gains.

Preferred Education

Master's Degree

Required Technical And Professional Expertise

Design, build, and maintain scalable, reliable data pipelines supporting analytics, operational dashboards, and hardware performance insights for IBM Quantum systems.

Contribute towards building IBM Quantum’s Lakehouse by implementing scalable data connectors.

Develop and operate ETL/ELT workflows and tooling with a focus on data quality, accuracy, timeliness, and continuous improvement.

Apply advanced SQL skills using PostgreSQL and Presto to support analytical workloads, including complex queries and performance tuning.
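A representative analytical pattern for this kind of workload is "latest record per entity" via a window function. The sketch below uses Python's built-in sqlite3 for a runnable example; the table, column names, and data are hypothetical, and PostgreSQL/Presto use essentially the same `ROW_NUMBER()` syntax.

```python
import sqlite3

# Hypothetical calibration table; schema and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE calibrations (device TEXT, ts INTEGER, error_rate REAL);
INSERT INTO calibrations VALUES
  ('ibm_q_a', 1, 0.012),
  ('ibm_q_a', 2, 0.009),
  ('ibm_q_b', 1, 0.020);
""")

# Latest calibration per device: rank rows within each device by timestamp,
# keep only the most recent one.
rows = conn.execute("""
SELECT device, error_rate FROM (
  SELECT device, error_rate,
         ROW_NUMBER() OVER (PARTITION BY device ORDER BY ts DESC) AS rn
  FROM calibrations
)
WHERE rn = 1
ORDER BY device;
""").fetchall()

print(rows)  # [('ibm_q_a', 0.009), ('ibm_q_b', 0.02)]
```

The same pattern extends to performance tuning: on PostgreSQL, an index on `(device, ts DESC)` would let the planner avoid a full sort.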

Build and operate orchestration workflows in Apache Airflow, including dependency management, retries, backfills, monitoring, and operational reliability.
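Airflow expresses retries declaratively via task parameters such as `retries` and `retry_exponential_backoff`; the underlying mechanism can be sketched in plain Python. Everything here (`run_with_retries`, `flaky_extract`) is a hypothetical illustration, not Airflow's internals.

```python
import time

def run_with_retries(task, retries=3, base_delay=1.0, _sleep=time.sleep):
    """Run `task`, retrying on failure with exponential backoff.

    A dependency-free sketch of what an orchestrator's retry policy does;
    `_sleep` is injectable so the backoff can be skipped in tests.
    """
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # retries exhausted: surface the failure for alerting
            _sleep(base_delay * (2 ** attempt))

# Hypothetical flaky extract step: fails twice, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky_extract, retries=3, _sleep=lambda s: None)
print(result, calls["n"])  # ok 3
```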

Implement data transformations and validations using Python (e.g., pandas and related libraries).
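A common shape for such validations is a function that returns a list of errors per record, with an empty list meaning the record passes. The sketch below is dependency-free for portability; pandas offers vectorized equivalents (boolean masks over columns). The record fields and thresholds are hypothetical, not IBM's schema.

```python
from dataclasses import dataclass

# Hypothetical telemetry record; field names are illustrative only.
@dataclass
class Reading:
    device: str
    qubits: int
    t1_us: float  # T1 coherence time in microseconds

def validate(rec: Reading) -> list[str]:
    """Return validation errors for a record (empty list = valid)."""
    errors = []
    if not rec.device:
        errors.append("device must be non-empty")
    if rec.qubits <= 0:
        errors.append("qubits must be positive")
    if not (0 < rec.t1_us < 10_000):
        errors.append("t1_us out of plausible range")
    return errors

batch = [Reading("ibm_q_a", 127, 250.0), Reading("", 0, -1.0)]
valid = [r for r in batch if not validate(r)]
print(len(valid))  # 1
```

Keeping the error messages (rather than a bare pass/fail flag) makes it easy to route rejected records to a quarantine table with a reason attached.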

Support large-scale batch processing for high-volume, heterogeneous datasets, including system telemetry, experiment metadata, cloud operations data, and device performance metrics.
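For high-volume batches, a standard technique is to stream the input in fixed-size chunks so memory stays bounded regardless of dataset size. A minimal stdlib sketch, with a toy range standing in for telemetry rows:

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of up to `size` items from any iterable.

    Bounds memory use: only one chunk is materialized at a time.
    """
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

# Hypothetical aggregation: per-chunk maxima over a large telemetry stream.
readings = range(10)  # stand-in for millions of telemetry rows
maxima = [max(c) for c in chunked(readings, 4)]
print(maxima)  # [3, 7, 9]
```

The same pattern underlies cursor-based database reads and file-by-file processing of partitioned datasets.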

Work with streaming platforms such as Apache Kafka or IBM Event Streams to consume event-driven data from distributed quantum systems and services.

Apply streaming architecture concepts including topics, partitions, consumer groups, and schema evolution.
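Two of these concepts can be sketched in a few lines: keyed partitioning (the same key always maps to the same partition, preserving per-key ordering) and consumer-group assignment (each partition is owned by exactly one consumer in the group). This is an illustrative stand-in, not Kafka's implementation: Kafka's default partitioner hashes with murmur2, while md5 is used here for a runnable stdlib example.

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a record key to a partition (md5 stand-in for Kafka's murmur2)."""
    digest = hashlib.md5(key.encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

def assign(partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    """Round-robin partition assignment across a consumer group:
    every partition is consumed by exactly one group member."""
    out = {c: [] for c in consumers}
    for p in range(partitions):
        out[consumers[p % len(consumers)]].append(p)
    return out

# Same key -> same partition, so events for one device stay ordered.
p1 = partition_for("ibm_q_a", 6)
p2 = partition_for("ibm_q_a", 6)
print(p1 == p2)  # True
print(assign(6, ["c1", "c2"]))  # {'c1': [0, 2, 4], 'c2': [1, 3, 5]}
```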

Integrate multiple technical data sources—quantum hardware telemetry, calibration data, experiment logs, job execution data, user activity, system health metrics—into trusted analytical datasets.

Collaborate with quantum hardware, software, product, SRE, and analytics teams to translate requirements into robust, production-ready data solutions.

Use Git-based version control, contribute via code reviews, and follow industry-standard software engineering best practices.

Preferred Technical And Professional Experience

Experience with Lakehouse solutions and architectures, including IBM watsonx.data.

Experience with distributed analytics engines such as Presto/Trino or Apache Spark.

Familiarity with data modeling techniques for analytical and reliability engineering use cases.

Exposure to data governance concepts such as access control, dataset ownership, lineage, and lifecycle management.

Experience operating data pipelines in cloud-based or distributed environments (e.g., hybrid cloud, containerized systems).

Experience working with hardware telemetry, infrastructure monitoring data, or high-volume operational datasets.

Interest in or exposure to quantum computing, advanced hardware systems, cryogenics, or other deep-technology platforms.

Skills & Requirements

Technical Skills

ETL, SQL, Python, pandas, Apache Airflow, Kafka, Collaboration, Problem-solving, Quantum computing, Data engineering

Employment Type

Full-time

Level

Mid

Posted

4/24/2026
