Data Engineering Manager

Geneva Trading
Chicago, US
On-site

Job Description

Founded in 1999, Geneva Trading is a premier global principal trading firm with strategically located offices in Chicago, Dublin, and London. Our relentless focus on trading excellence combined with technological innovation has equipped us with a best-in-class proprietary trading platform, enabling us to compete at the highest levels in the global markets. Rooted in a culture of integrity, collaboration, and an unwavering passion for progress, we foster an environment of personal and professional excellence. Our nimble organizational structure and entrepreneurial spirit attract top-tier talent with a passion for innovation, laying the foundation and driving our consistent success in the industry.

About Geneva Trading

Geneva Trading is a proprietary trading firm focused on high-frequency and algorithmic trading across global markets. Our edge depends not only on execution, but on the quality, completeness, and usability of the data that powers research, analytics, monitoring, and trading decisions.

Why This Role Matters

Every trading decision at Geneva is built on data. This role sits at the center of that — you will own the platforms that capture, normalize, store, and distribute market data to the systems and people who depend on it most. When your pipelines are fast, correct, and reliable, the entire firm operates with better information.

This is a hands-on engineering leadership role. You will spend a significant portion of your time writing production code, designing data systems, and solving complex engineering problems alongside a team of 2–3 engineers. We are looking for someone who leads by building — not by delegating from a distance.

What Success Looks Like

  • In the first 90 days, you have a working understanding of the data stack, have shipped meaningful improvements to at least one pipeline, and have built trust with the trading and research teams.
  • By six months, you are driving the technical roadmap for market data infrastructure, have improved reliability or performance in a measurable way, and are actively mentoring your team through code reviews and design discussions.
  • At one year, you own the market data platform end to end — ingestion through delivery — and are a trusted technical partner to stakeholders across the firm.

Key Responsibilities

Market Data Pipeline Engineering

Own the end-to-end market data pipeline — from multi-venue ingestion through normalization to near-real-time delivery — with a focus on correctness, resilience, and recoverability.

  • Integrate direct feed capture alongside third-party vendor data
  • Build replay, recovery, and gap-detection capabilities
  • Ensure sequencing, validation, and data availability at the speed the business requires
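To make the gap-detection requirement concrete, here is a minimal sketch of sequence-number gap detection of the kind such a pipeline needs. It assumes each message carries a monotonically increasing per-channel sequence number (common in exchange multicast feeds); the function name and inputs are illustrative, not part of Geneva's actual stack.

```python
# Minimal sketch: detect sequence-number gaps in a market data feed.
# Assumes a monotonically increasing per-channel sequence number.

def find_gaps(seq_numbers):
    """Return (start, end) ranges of missing sequence numbers."""
    gaps = []
    expected = None
    for seq in seq_numbers:
        if expected is not None and seq > expected:
            # Messages expected..seq-1 were never seen: candidates for replay.
            gaps.append((expected, seq - 1))
        expected = seq + 1
    return gaps

# Messages 4-5 and 9 were dropped and should be re-requested.
print(find_gaps([1, 2, 3, 6, 7, 8, 10]))  # [(4, 5), (9, 9)]
```

In production this logic runs per channel and feeds a replay/recovery request path rather than a list.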

Time-Series Architecture (KDB+/Q)

Design and operate KDB+/Q platforms for real-time and historical market data, supporting analytical workflows for trading and research.

  • Optimize schema design, partitioning strategies, and query performance
  • Manage data retention, lifecycle policies, and long-term maintainability
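As background on the partitioning strategies mentioned above: kdb+ historical databases are commonly partitioned by date, so queries filtered on date touch only the relevant partition directories. The pure-Python sketch below illustrates that routing idea only; the paths and field names are hypothetical, and a real HDB is written by kdb+ itself.

```python
# Illustration of date partitioning (the scheme kdb+ HDBs typically use):
# route each tick to a per-date partition directory. Paths are hypothetical.

from collections import defaultdict
from datetime import datetime, timezone

def partition_path(root, table, ts_ns):
    """Map an epoch-nanosecond timestamp to its date-partition directory."""
    day = datetime.fromtimestamp(ts_ns / 1e9, tz=timezone.utc).date()
    return f"{root}/{day.isoformat()}/{table}/"

def bucket_by_partition(root, table, ticks):
    buckets = defaultdict(list)
    for tick in ticks:
        buckets[partition_path(root, table, tick["ts"])].append(tick)
    return dict(buckets)

ticks = [
    {"ts": 1_700_000_000_000_000_000, "sym": "ESZ3", "px": 4512.25},
    {"ts": 1_700_086_400_000_000_000, "sym": "ESZ3", "px": 4520.00},
]
print(sorted(bucket_by_partition("/hdb", "trade", ticks)))
```

Retention and lifecycle policies then become directory-level operations: aging out a date means dropping one partition, not rewriting the table.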

Data Distribution & Platform Integration

Design scalable, reliable data delivery to downstream consumers using streaming and messaging technologies.

  • Define data contracts and schemas that downstream teams can depend on
  • Balance real-time delivery with durability and replayability
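A data contract of the kind described above can be sketched as a versioned schema that published records are validated against. The field names and versions below are hypothetical, purely to show the shape of the idea.

```python
# Minimal sketch of a downstream data contract: a published record declares
# a schema version and must carry the fields that version promises.
# Field names and versions are hypothetical.

CONTRACTS = {
    ("trade", 1): {"ts", "sym", "px", "qty"},
    ("trade", 2): {"ts", "sym", "px", "qty", "venue"},  # additive change only
}

def validate(record):
    """Reject records whose fields don't satisfy their declared contract."""
    key = (record.get("type"), record.get("schema_version"))
    required = CONTRACTS.get(key)
    if required is None:
        raise ValueError(f"unknown contract {key}")
    missing = required - record.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return True

validate({"type": "trade", "schema_version": 2, "ts": 0,
          "sym": "ESZ3", "px": 4512.25, "qty": 3, "venue": "CME"})
```

Keeping version bumps additive is what lets downstream consumers upgrade on their own schedule while replayed history stays decodable.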

Tooling, Libraries & Supporting Systems

Build the internal tooling and shared libraries that make your team and others more productive.

  • Develop validation, monitoring, replay, and analytics tools
  • Own supporting systems for reference data, configuration, and metadata

Technical Leadership & Production Ownership

Lead your team through hands-on contribution, design reviews, and code reviews. Set engineering standards by example.

  • Partner with trading and research teams to understand their data needs and translate them into platform improvements
  • Own the reliability of core data systems, including debugging production issues during market hours
  • Define monitoring, alerting, and data quality observability

Technology Stack

  • KDB+ / Q
  • Python
  • C / C++
  • Linux
  • Docker
  • Git / CI-CD
  • Binary market data protocols
  • Streaming / message bus platforms
  • Kernel-bypass / high-performance networking
  • Industry-standard messaging protocols (FIX, SBE)

Required Qualifications

  • 7+ years of hands-on data engineering or market data infrastructure experience — you are an active, practicing engineer who writes production code regularly
  • 3+ years leading engineering teams while remaining deeply technical — your references can speak to your code contributions as well as your leadership
  • Expert-level KDB+/Q — you write complex Q, optimize tick plant performance, and can debug production HDB issues independently
  • Strong, production-quality Python — well-tested, packaged, maintainable systems-level code, not scripting or glue
  • Verifiable experience building low-latency market data decoders for real exchange protocols — you will personally own decoder development
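To illustrate the decoder work named above, here is a hedged sketch of decoding a fixed-layout binary message with Python's `struct` module. The wire format is invented for illustration; real exchange protocols (e.g. SBE-based feeds) define their own layouts, and latency-critical decoders are typically written in C/C++.

```python
# Sketch: decode a fixed-layout binary trade message. The layout below is
# hypothetical: little-endian u64 seq, u64 ts_ns, 8-byte null-padded symbol,
# i64 fixed-point price (4 implied decimals), u32 quantity.

import struct

TRADE = struct.Struct("<QQ8sqI")

def decode_trade(buf):
    seq, ts_ns, sym, px_fixed, qty = TRADE.unpack_from(buf)
    return {
        "seq": seq,
        "ts_ns": ts_ns,
        "sym": sym.rstrip(b"\x00").decode(),
        "px": px_fixed / 1e4,  # undo the 4 implied decimals
        "qty": qty,
    }

raw = TRADE.pack(42, 1_700_000_000_000_000_000, b"ESZ3", 45122500, 3)
print(decode_trade(raw))
```

The production concerns the role emphasizes — zero-copy parsing, per-channel sequencing, and recovery — layer on top of this basic unpack step.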

Skills & Requirements

Technical Skills

KDB+/Q, Python, low-latency market data decoders, real exchange protocols, leadership, communication, data engineering, market data pipeline engineering, time-series architecture, data distribution, platform integration, tooling, libraries

Employment Type

Full-time

Level

Senior

Posted

5/6/2026

Apply Now

You will be redirected to Geneva Trading's application portal.
