Middle Data Engineer

The Open Platform
Dubai, AE

Job Description

We are looking for a Middle Data Engineer to join our data platform team. You will help build and maintain data pipelines that power analytics for multiple products and teams, working closely with senior engineers. This role suits individuals early in their careers who aspire to become strong Data Engineers and want hands-on experience with modern data workflows, building data systems, and writing production-quality code.

Responsibilities:

Participate in the development and maintenance of data pipelines and data-related services.

Contribute to the existing codebase for shared tools and libraries.

Assist with upgrading data platform components and services.

Communicate with analysts to understand their data needs.

Example tasks for this role:

Extend an existing SQL-based pipeline with a new transformation needed by analytics.

Add a new data source to an existing ETL process under the supervision of a senior engineer.

Refactor a small Python script into a clearer, modular structure and add logging.

Help configure CI steps for linting and tests for a data repository.

Update documentation for a pipeline after changes in logic or schema.
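To give a flavor of the script-refactoring task above, here is a minimal sketch of what "a clearer, modular structure with logging" might look like in Python. All names and data are hypothetical illustrations, not part of an actual pipeline:

```python
import logging

# Hypothetical example: inline filtering logic pulled out into a
# small named function, with logging added for observability.
logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
logger = logging.getLogger(__name__)

def filter_active_users(rows):
    """Keep only rows marked active; log how many rows were kept."""
    active = [row for row in rows if row.get("status") == "active"]
    logger.info("kept %d of %d rows", len(active), len(rows))
    return active

if __name__ == "__main__":
    sample = [
        {"id": 1, "status": "active"},
        {"id": 2, "status": "inactive"},
    ]
    print(filter_active_users(sample))  # → [{'id': 1, 'status': 'active'}]
```

Small, testable functions like this are exactly the kind of change a Middle engineer would make under review.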

Requirements:

Confident in communication and proactive in seeking clarification when needed.

Demonstrates responsibility and ownership, and proactively communicates when challenges arise.

Comfortable working with IDEs and version control systems like Git.

Basic understanding of clean code principles and software delivery workflows.

Solid grasp of Python essentials, including language fundamentals and data structures.

Confident with SQL basics.

Regular and thoughtful use of AI tools.

Desired Skills (not mandatory, but a plus):

Strong motivation to learn and grow in data engineering.

Knowledge of data engineering fundamentals (ETL, data modeling, data quality, and various storage systems).

Exposure to any cloud platform (GCP, AWS, or Azure).

Experience with Apache Airflow, or an understanding of what orchestration tools are used for.

Familiarity with containerization; basic experience with Docker (build, run, logs).

Experience with BI tools (Superset, Metabase, Power BI, etc.).

Any data-related pet projects: ETL scripts, dashboards, or analytics for personal projects or hackathons.


Skills & Requirements

Technical Skills

Python, SQL, Apache Airflow, Docker, GCP, communication, responsibility, ownership, data engineering, ETL, data modeling, data quality

Level

junior

Posted

3/27/2026
