Data Engineer / Analytics Developer (Databricks | Fabric | Power BI)

Clarrow
Hong Kong, HK
Hybrid

Job Description

We are seeking a versatile Data Engineer / Analytics Developer to join a high-impact data transformation programme within a leading organisation in Hong Kong.

This hybrid role combines data engineering, data modelling, and business intelligence. It is ideal for someone who enjoys working across the full data lifecycle, from building pipelines to delivering meaningful insights to stakeholders.

Key Responsibilities:

  • Design, build, and optimise data pipelines using Spark / PySpark (batch, incremental, CDC)
  • Develop dimensional data models (star/snowflake schemas) to support analytics
  • Create and deliver interactive Power BI dashboards and reports
  • Work across Databricks and Microsoft Fabric (Lakehouse, Data Factory, Warehouse)
  • Integrate and transform data from multiple sources (SQL, APIs, etc.)
  • Implement data quality checks, validation, and documentation
  • Collaborate with business stakeholders to translate requirements into data solutions
  • Take ownership of end-to-end delivery (requirements → build → deployment)

Requirements:

  • 5–7 years of experience in data engineering, analytics, or BI development
  • Hands-on experience with Databricks (Spark/PySpark) and/or Microsoft Fabric
  • Strong SQL and data modelling expertise
  • Proven experience building Power BI dashboards (DAX, Power Query)
  • Experience working with modern data platforms (Lakehouse / Warehouse)
  • Strong communication skills and ability to work with both technical and business teams

Other Competencies:

  • Strong understanding of data architecture, modelling, and analytics workflows
  • Ability to bridge data engineering and business intelligence functions effectively
  • Experience with data quality frameworks, testing, and governance practices
  • Familiarity with CI/CD, version control, and collaborative development practices
  • Hands-on experience with Row-Level Security (RLS) and secure data access design

Why Apply:

  • Work on a modern data stack (Databricks + Fabric)
  • Opportunity for true end-to-end ownership across engineering and analytics
  • High visibility role with strong stakeholder interaction
  • Collaborative, international working environment

Skills & Requirements

Technical Skills

Databricks, Spark, PySpark, Power BI, SQL, data modelling, Lakehouse, Data Factory, Warehouse, APIs, data quality checks, validation, documentation, row-level security (RLS), secure data access design, CI/CD, version control, collaborative development practices, communication, ability to work with both technical and business teams, data engineering, business intelligence

Level

mid

Posted

4/22/2026
