Senior Data Engineer (Python) Contract

Marc Ellis Consulting
AE
On-site

Job Description

Location: Dubai, United Arab Emirates

Engagement: Contractual

Duration: 12 Months (Extendable, Long-term with Marc Ellis)

Role Overview:

We are looking for a talented and driven Senior Data Engineer (Python) to join our growing technology team. You will sit at the intersection of backend engineering and data, owning the design and delivery of high-performance APIs, microservices, and data transformation pipelines that power our data-driven applications. This is not a generalist web development role: the right candidate brings strong Python fundamentals, a production mindset around performance and security, and the confidence to work across the full backend stack, from async API design to ETL pipelines and database optimisation.

Education:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Key Technical Experience:

  • Python: Strong command of Python — OOP, async programming, type hints, testing, and code quality standards.
  • API Frameworks: Hands-on production experience with Flask and FastAPI. Familiarity with Django or Pyramid is a plus.
  • Async & Concurrency: Confident using asyncio and async/await patterns to build responsive, concurrent services.
  • Security: Practical knowledge of OAuth 2.0, JWT, API gateway configuration, and secure microservice patterns.
  • ORMs & Databases: Experience with SQLAlchemy or Django ORM; proficient in both relational (PostgreSQL, MySQL) and NoSQL databases.
  • Data Engineering: Familiarity with ETL design, data pipeline tooling, and big data technologies such as Spark, Kafka, or Hadoop.
  • CI/CD & DevOps: Experience building and maintaining CI/CD pipelines (e.g., GitHub Actions, GitLab CI, Jenkins) and productionising services.
  • Scripting: Proficiency in SQL and shell scripting for data tasks and automation.
  • Version Control: Strong Git practice: branching, merging, pull requests, and collaborative workflows.
  • Problem-Solving: Analytical thinker who diagnoses root causes, not just symptoms. High attention to detail.
  • Communication: Writes clearly, speaks confidently, collaborates well across engineering and data teams.
  • Cloud & Containers (Preferred): Experience with AWS, Azure, or GCP and containerisation (Docker, Kubernetes) is an advantage.
  • Data Governance (Preferred): Awareness of data governance frameworks, compliance requirements, and data security practices.

Key Responsibilities:

Core Python & API Engineering

  • Build and maintain production-grade RESTful APIs using Flask and FastAPI as primary frameworks.
  • Write high-quality, asynchronous Python code using asyncio — managing concurrent connections and optimising I/O-bound workloads.
  • Architect and contribute to microservices-based systems, ensuring clean separation of concerns and maintainable codebases.
  • Apply Python best practices: type hints, code reviews, unit and integration testing, and clear documentation.
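To illustrate the kind of async work these bullets describe, here is a minimal, self-contained sketch of the asyncio pattern for optimising I/O-bound workloads (the simulated `fetch_record` call stands in for a real database or HTTP request and is a hypothetical example, not part of this role's codebase):

```python
import asyncio

async def fetch_record(record_id: int) -> dict:
    # Simulate an I/O-bound call (e.g. a database query or HTTP request).
    await asyncio.sleep(0.01)
    return {"id": record_id, "status": "ok"}

async def fetch_all(ids: list[int]) -> list[dict]:
    # Run the I/O-bound calls concurrently instead of one after another;
    # gather preserves the order of the inputs in its results.
    return await asyncio.gather(*(fetch_record(i) for i in ids))

results = asyncio.run(fetch_all([1, 2, 3]))
```

The same `async def` endpoints plug directly into FastAPI route handlers, which is where this concurrency model pays off in API services.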

Security-First API Design

  • Design and implement secure APIs with authentication and authorisation flows using OAuth 2.0 and JWT.
  • Integrate and configure API gateways; enforce encryption and access control at the microservice level.
  • Stay current on evolving security practices and proactively identify vulnerabilities in API and service design.
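As a sketch of the JWT mechanics behind these responsibilities, the following stdlib-only example shows the HS256 sign-and-verify idea (unpadded URL-safe base64 over header, payload, and HMAC signature). It is illustrative only; production services would use a maintained library such as PyJWT rather than hand-rolled code:

```python
import base64
import hashlib
import hmac
import json

def _b64(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64 for each segment.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = f"{_b64(json.dumps(header).encode())}.{_b64(json.dumps(payload).encode())}"
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{_b64(sig)}"

def verify_token(token: str, secret: bytes) -> bool:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(_b64(expected), sig)
```

In an OAuth 2.0 flow, the authorisation server issues tokens like this and each microservice (or the API gateway) verifies the signature before trusting the claims.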

Backend Systems & Data Access

  • Develop scalable backend systems that serve datasets through APIs and streaming interfaces.
  • Build and manage data access layers using ORMs (SQLAlchemy, Django ORM) — modelling data, handling CRUD operations, managing transactions.
  • Integrate with a range of relational and NoSQL databases; design schemas that balance normalisation with query performance.
  • Optimise backend services for throughput, latency, and reliability under production load.
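The data access layer work above would normally go through an ORM such as SQLAlchemy; the stdlib `sqlite3` sketch below shows the same shape (schema, CRUD, and transaction handling) without any third-party dependency. The `records` table is a hypothetical example:

```python
import sqlite3

def create_schema(conn: sqlite3.Connection) -> None:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"
    )

def insert_record(conn: sqlite3.Connection, name: str) -> int:
    # Using the connection as a context manager commits the transaction
    # on success and rolls it back if an exception is raised.
    with conn:
        cur = conn.execute("INSERT INTO records (name) VALUES (?)", (name,))
    return cur.lastrowid

def fetch_names(conn: sqlite3.Connection) -> list[str]:
    return [row[0] for row in conn.execute("SELECT name FROM records ORDER BY id")]

conn = sqlite3.connect(":memory:")
create_schema(conn)
insert_record(conn, "alpha")
insert_record(conn, "beta")
```

With SQLAlchemy the same pattern becomes declarative models plus a session scope, but the transaction boundaries and parameterised queries shown here are the part that carries over.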

Data Transformation & Pipeline Engineering

  • Design and implement ETL/ELT pipelines to ingest, clean, transform, and deliver data for downstream analytics and applications.
  • Enforce data integrity, quality, and consistency standards across all transformation workflows.
  • Work with big data tooling (Spark, Kafka, Hadoop) where data volumes and velocity demand it.
  • Collaborate with data and analytics teams to understand pipeline requirements and deliver timely, reliable data products.

DevOps, Quality & Collaboration

  • Own CI/CD pipelines for the services you build — from automated testing to deployment and monitoring.
  • Use Git for version control: branching strategy, pull requests, code review, and conflict resolution.
  • Debug and resolve issues systematically using logging, profiling, and testing frameworks.
  • Communicate clearly with stakeholders, write actionable documentation, and contribute positively to team standards.

Skills & Requirements

Technical Skills

Python, Flask, FastAPI, Django, Pyramid, asyncio, JWT, OAuth 2.0, SQLAlchemy, Django ORM, PostgreSQL, MySQL, NoSQL, ETL, Spark, Kafka, Hadoop, CI/CD, Git, Shell scripting, SQL, Communication, Teamwork, Problem-solving, Leadership, Decision-making, Backend engineering, Data engineering, API design, Microservices, Data pipelines, Big data, Cloud, Containers, Data governance

Employment Type

Contract

Level

Senior

Posted

5/8/2026
