Introduction
At IBM Consulting UK FutureNow, you’ll build a career at the forefront of hybrid cloud and AI, working with leading clients across the public and private sectors.
You’ll collaborate with top industry professionals, gain hands-on experience with cutting-edge technologies, and deliver solutions that create real business impact. From day one, you’ll work on meaningful, high-profile programmes that stretch your skills and accelerate your growth.
We invest heavily in you, supporting continuous learning, in-demand skills development, and long-term career progression. You’ll thrive in a flexible, inclusive environment that values curiosity, encourages reinvention, and recognises what makes you unique.
We Offer
- Tools and policies to support your work-life balance, from flexible working approaches, sabbatical programmes, paid paternity and maternity leave, to an innovative maternity returners scheme
- More traditional benefits, such as 25 days’ holiday (in addition to public holidays), private medical, dental and optical cover, online shopping discounts, an Employee Assistance Programme, life assurance, and a group pension plan through salary sacrifice
Your Role And Responsibilities
As a Senior Data Engineer within Data & Analytics Platforms, you will lead the design and delivery of scalable, reliable data engineering solutions that underpin analytics, data science, and AI use cases across client environments. You will work across modern data platforms, integrations, and pipelines, supporting a wide range of analytical and AI-driven projects.
This role focuses on building the data foundations required for advanced analytics and AI, working closely with data scientists and client stakeholders to ensure data is accessible, trusted, and fit for purpose.
Core Responsibilities
- Lead the design and development of robust data pipelines and data platforms to support analytics and AI workloads
- Build and maintain batch and streaming data solutions, including ingestion, transformation, and orchestration
- Design appropriate data models using relational, dimensional, or other modelling techniques, selecting approaches based on the requirements of analytics, reporting, and data science use cases
- Collaborate across multiple teams and client stakeholders to design and deliver end-to-end data and AI solutions within the wider client technology landscape
- Work across cloud-based data platforms and services (e.g. AWS, Azure, or GCP)
- Design and deliver data solutions that are functionally correct and fit for purpose, while meeting non-functional requirements such as security, performance, resilience, and maintainability
- Support and guide junior engineers through technical leadership and mentoring
- Contribute to platform and engineering standards, best practices, and continuous improvement across engagements
Preferred Education
Bachelor's Degree
Required Technical And Professional Expertise
- Strong experience with Python, SQL, and data engineering frameworks or tools
- Proven experience designing and delivering data pipelines and data platforms in a professional environment
- Hands-on experience with cloud data services (e.g. Azure Data Factory, Databricks, Synapse, AWS Glue, Redshift, BigQuery, or equivalents)
- Experience working with relational and/or NoSQL databases
- Solid understanding of data modelling, integration patterns, and data quality principles
- Strong problem-solving and communication skills, with experience working in multidisciplinary delivery teams
This role is subject to pre-employment screening in line with the UK Government’s Baseline Personnel Security Standard (BPSS). An additional range of personal security controls, referred to as National Security Vetting (NSV), may apply; this could include meeting the eligibility requirements for Security Check (SC) or Developed Vetting (DV) clearance.
Preferred Technical And Professional Experience
- Familiarity with data pipeline orchestration tools (e.g. Apache Airflow, Luigi)
- Knowledge of data quality and metadata management practices
- Understanding of data virtualisation and data federation techniques
- Experience with big data technologies (e.g. Hadoop, Spark)
- Experience supporting analytics, data science, or machine learning use cases
- Exposure to streaming or event-driven architectures (e.g. Kafka or equivalent)
- Experience with CI/CD, infrastructure as code, or platform automation
- Background delivering solutions in regulated or security-constrained environments such as the public sector or financial services