Data Engineer (Microsoft Fabric | Azure | Databricks) - RibbitZ

Taraki
AE
On-site

Job Description

Our client RibbitZ is looking for a Data Engineer (Microsoft Fabric | Azure | Databricks) in Dubai.

Role Summary

We are seeking a Data Engineer with hands-on experience in Azure-based data platforms and Microsoft Fabric to build scalable data pipelines and analytics solutions.

The ideal candidate will have strong skills in PySpark, SQL, and modern data engineering frameworks, with exposure to lakehouse architecture, real-time data processing, and cloud-native tools.

Key Responsibilities

  • Develop and maintain ETL/ELT pipelines using Azure-native tools and frameworks.
  • Process structured and semi-structured data (JSON, XML, CSV, APIs).
  • Implement incremental data loading and transformation logic.
  • Ensure data quality, validation, and consistency across pipelines.
  • Work with Microsoft Fabric components, including:
      • Eventstream
      • Eventhouse
      • Fabric Pipelines
  • Implement Lakehouse architecture using the Medallion model (Bronze, Silver, Gold).
  • Utilize Delta Lake for efficient storage and processing.
  • Build data solutions using:
      • Azure Data Factory (ADF)
      • Azure Databricks
      • ADLS Gen2
  • Ingest data from multiple sources, including:
      • ERP / SAP systems
      • APIs and external data sources
  • Support cloud-based data integration and transformation.
  • Develop data transformation workflows using PySpark / Apache Spark.
  • Optimize pipelines through partitioning, caching, and efficient joins.
  • Work on both batch and streaming data processing.
  • Build analytics-ready datasets for reporting and BI.
  • Apply dimensional modeling concepts.
  • Deliver curated datasets for downstream consumption.
  • Implement real-time data pipelines using:
      • Microsoft Fabric Eventstream
      • IoT / streaming data sources
  • Enable real-time dashboards and monitoring use cases.
  • Integrate data pipelines with Power BI dashboards.
  • Support business teams with clean and reliable datasets.
  • Work with:
      • dbt for transformation and modeling
      • Snowflake for analytics workloads
  • Develop modular, reusable, and scalable data workflows.
  • Use GitHub / Git for version control.
  • Follow CI/CD and deployment best practices.
  • Document data pipelines and processes.

Required Skills & Technologies

Core Skills

  • Python / PySpark
  • SQL (Strong proficiency)
  • ETL/ELT pipeline development

Cloud & Tools

  • Microsoft Azure:
      • Azure Data Factory (ADF)
      • Azure Databricks
      • ADLS Gen2
  • Microsoft Fabric (preferred)

Data Engineering Concepts

  • Lakehouse architecture
  • Delta Lake
  • Data modeling (basic dimensional modeling)
  • Incremental data processing

Additional Tools

  • dbt (preferred)
  • Snowflake (good to have)
  • Power BI

Experience Requirements

  • 3–5 years of experience in Data Engineering
  • Hands-on experience in:
      • Azure data ecosystem
      • Databricks / PySpark
      • Building data pipelines and transformations

Educational Qualifications

  • Bachelor's degree in Computer Science / Engineering / IT
  • Relevant certifications are a plus:
      • Microsoft Fabric (DP-600 / DP-700)
      • Azure Data Engineer (DP-203)
      • Databricks certifications

Soft Skills

  • Strong analytical and problem-solving skills
  • Ability to work in a collaborative Agile environment
  • Good communication skills
  • Attention to detail and data accuracy

Nice to Have

  • Exposure to real-time/streaming data pipelines
  • Experience with API integrations
  • Basic understanding of machine learning workflows
  • Experience working with ERP/SAP data

Skills & Requirements

Technical Skills

Python, PySpark, SQL, Azure Data Factory (ADF), Azure Databricks, ADLS Gen2, Delta Lake, dbt, Snowflake, Power BI, analytical skills, problem-solving skills, collaborative Agile environment, communication skills, attention to detail, data accuracy, Microsoft Fabric (DP-600 / DP-700), Azure Data Engineer (DP-203), Databricks certifications, Finance, Healthcare, Insurance

Level

Mid

Posted

4/22/2026
