Title: Cloud Data Platform Engineering Lead
Note: This position follows a hybrid work model, requiring 2 days per week on-site at our corporate office at 20 S Wacker Dr, Chicago, IL 60606.
Preference for this role is given to local candidates in the Chicago area.
Job Summary
We are seeking a Senior Staff Engineer to architect and drive enterprise data solutions for CME on the Google Cloud Platform (GCP). This role requires a deep understanding of the end-to-end data ecosystem, including operational and analytical stores, AI/ML integration, data lifecycle management, compliance, cost optimization, and modern cloud-native tools. As a principal technical leader, you will provide the architectural vision and drive complex implementations, ensuring seamless alignment between technical data strategies and overarching business objectives.
What You’ll Get
- A supportive environment fostering career progression, continuous learning, and an inclusive culture.
- Broad exposure to CME's diverse products, asset classes, and cross-functional teams.
- A competitive salary and comprehensive benefits package. Explore our full range of benefits.
What You’ll Do
- Architect Enterprise Solutions: Lead the design, architecture, and development of massive, enterprise-scale data solutions using the GCP suite (BigQuery, Dataflow, Cloud Storage, Pub/Sub, and Vertex AI) with a focus on performance and security.
- Champion Cloud-Native Mastery: Drive the adoption of sophisticated workflows utilizing Kubernetes, Terraform, Config Connector (KCC), Argo Workflows, and CI/CD frameworks.
- Integrate at Scale: Collaborate with cross-functional teams to bridge data workflows with operational and analytical stores, ensuring system interoperability and reliability.
- Innovate and Future-Proof: Continuously evaluate and adopt emerging GCP services and modern technologies to enhance our data capabilities and insulate the architecture against future shifts.
- Govern with Integrity: Define and enforce robust strategies for the entire data lifecycle—from retention to disposal—strictly adhering to governance and regulatory standards.
- Optimize the Platform: Spearhead enterprise-wide initiatives to optimize cloud costs, maximizing resource efficiency and eliminating waste without compromising performance.
- Set the Gold Standard: Establish best practices for data quality, metadata management, and lineage tracking to maintain a rigorous data governance framework.
- Mentor Engineering Talent: Provide principal-level technical mentorship, establishing coding best practices with a primary focus on Java and Python development to drive architectural excellence.
- Engineer High-Throughput Pipelines: Architect and implement resilient data pipelines using Apache Spark, Apache Flink, and Google Dataflow, heavily leveraging Java frameworks and APIs.
- Align with the Enterprise: Apply a deep understanding of the SDLC and application stacks to keep data initiatives aligned with broader enterprise systems.
What You’ll Bring
- Academic Foundation: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related technical field.
- Architectural Depth: 10+ years of progressive software and data engineering experience, including 5+ years of hands-on experience designing large-scale GCP architectures.
- Expert Programming: Expert-level proficiency and extensive hands-on experience in Java, coupled with strong proficiency in Python or similar languages.
- Cloud-Native Expertise: Mastery of Kubernetes, Terraform, Config Connector (KCC), Argo Workflows, and CI/CD frameworks to guide implementation strategies.
- GCP Specialization: Advanced knowledge of BigQuery, Cloud SQL, IAM, KMS, GKE, GCS, Pub/Sub, and Vertex AI.
- Framework Fluency: Hands-on experience with large-scale processing frameworks such as Apache Spark or Apache Flink, specifically within the Java ecosystem.
- Analytical & Semantic Insight: A robust understanding of operational and analytical data stores and of semantic layer design using tools like Looker or Tableau.
- Governance Mastery: Strong knowledge of SDLC processes, data security, and comprehensive data lifecycle management.
- Preferred Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification is highly valued.
Key Competencies
- Analytical Excellence: Relentless focus on scalability, fault tolerance, and technical efficiency in solving complex problems.
- Strategic Communication: Ability to engage effectively with both deeply technical engineering teams and high-level business stakeholders.
- Influence Without Authority: Proven ability to guide and mentor cross-functional teams to deliver strategic technical outcomes without direct reporting lines.
- Drive Engagement: A commitment to fostering a culture of technical rigor and continuous learning across the engineering organization.
#LI-DS2