🌟 We're Hiring: Data Engineer! 🌟
We are seeking a skilled Data Engineer to join our dynamic team in Singapore. The ideal candidate will have hands-on experience in data modeling, ETL processes, and database management. You will play a crucial role in building and optimizing our data pipelines to support business intelligence and analytics.
📍 Location: Singapore, Singapore
⏰ Work Mode: Work From Office
💼 Role: Data Engineer
Key Responsibilities:
- Data Pipeline Development: Design, build, and maintain scalable ETL/ELT processes using Databricks and Spark.
- Data Transformation: Transform and clean data using SQL and Python in Databricks.
- Spark Optimization: Tune Spark jobs and data transformations for performance.
- Lakehouse Architecture: Implement data reliability, governance, and security using Delta Lake and Unity Catalog.
- Workflow Orchestration: Automate data workflows using Databricks Workflows or tools such as Airflow.
- Dashboards & Reporting: Build dashboards using Power BI or Tableau.
- Data Quality: Ensure data quality and consistency across pipelines.
- Collaboration: Work with analysts and business stakeholders to deliver insights.
- Documentation: Create documentation for data workflows and dashboards.
Requirements:
- Strong SQL and Python skills, including PySpark
- Experience with dashboarding tools (Power BI or Tableau)
- Hands-on experience with Databricks or comparable big data platforms
- Databricks features: Delta Lake, Structured Streaming, Delta Live Tables (DLT), Notebooks, and Repos
- Experience with at least one cloud platform (AWS, Azure, or GCP)
- DevOps: Git, Terraform, and CI/CD practices
Good to Have:
- JIRA or QA/testing experience