Big Data Platform Engineer – Singapore

Intellect Minds Pte Ltd
SG
On-site

Job Description

Role Description

Operate Global Data Platform components (VM servers, Kubernetes, Kafka) and applications (Apache stack, Collibra, Dataiku, and similar tools).

Implement automation of infrastructure, security components, and Continuous Integration & Continuous Delivery (CI/CD) for optimal execution of data pipelines (ELT/ETL).

Develop solutions that build resiliency into data pipelines through platform health checks, monitoring, and alerting mechanisms, so that the quality, timeliness, recency, and accuracy of data delivery improve (a minimal health-check sketch follows at the end of this section).

Apply DevSecOps & Agile approaches to deliver a holistic, integrated solution in iterative increments.

Liaise and collaborate with enterprise security, digital engineering, and cloud operations to gain consensus on architecture solution frameworks.

Review system issues, incidents, and alerts to identify root causes and continuously implement features to improve platform performance.

Stay current on the latest industry developments and technology trends to effectively lead and design new features/capabilities.
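
For illustration only, a minimal sketch of the kind of platform health check and alerting mechanism described above, in Python. The component names and endpoint URLs are hypothetical placeholders; a real deployment would feed alerts into a monitoring and paging stack rather than a log.

    # Minimal health-check/alerting sketch. Component names and URLs are
    # hypothetical placeholders, not real platform endpoints.
    import logging
    import requests

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("platform-health")

    # Hypothetical health endpoints for platform components.
    ENDPOINTS = {
        "kafka-proxy": "http://kafka-proxy.internal:8080/health",
        "ingest-api": "http://ingest-api.internal:8080/health",
    }

    def check_all() -> None:
        """Probe each component and raise an alert (here: a log line) on failure."""
        for name, url in ENDPOINTS.items():
            try:
                resp = requests.get(url, timeout=5)
                if resp.status_code == 200:
                    log.info("%s healthy", name)
                else:
                    log.error("ALERT: %s unhealthy (HTTP %s)", name, resp.status_code)
            except requests.RequestException as exc:
                log.error("ALERT: %s unreachable (%s)", name, exc)

    if __name__ == "__main__":
        check_all()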

Experience

5+ years of experience building or designing large-scale, fault-tolerant, distributed systems.

Experience migrating storage technologies (e.g., HDFS to S3 object storage); see the migration sketch at the end of this section.

Integration of streaming and file-based data ingestion/consumption (Kafka, Control-M, AWA).

Experience in DevOps, data pipeline development, and automation using Jenkins and Octopus (optional: Ansible, Chef, XL Release, and XL Deploy)

Experience predominantly with on-premises Big Data architecture; cloud migration experience is a plus.

Hands-on experience in integrating Data Science Workbench platforms (e.g. Dataiku)

Experience with agile project management and methods (e.g., Scrum, SAFe).

Experience supporting all analytical value streams, from enterprise reporting (e.g., Tableau) to data science (incl. MLOps).
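
For illustration only, a minimal sketch of an HDFS-to-S3 object storage copy using pyarrow's filesystem interfaces. The namenode host, bucket, and paths are hypothetical placeholders, and pyarrow is just one option; large production migrations more commonly use hadoop distcp or a managed transfer service.

    # Minimal HDFS -> S3 copy sketch using pyarrow. Requires libhdfs (the
    # Hadoop native client) on the machine running it; all host, bucket,
    # and path names are hypothetical placeholders.
    from pyarrow import fs

    hdfs = fs.HadoopFileSystem(host="namenode.internal", port=8020)
    s3 = fs.S3FileSystem(region="ap-southeast-1")

    # Recursively copy a directory tree from HDFS into an S3 prefix.
    fs.copy_files(
        source="/data/warehouse/events",
        destination="my-data-lake-bucket/warehouse/events",
        source_filesystem=hdfs,
        destination_filesystem=s3,
    )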

Skills

Hands-on working knowledge of large data solutions (e.g., data lakes, delta lakes, data meshes, data lakehouses, data platforms, or data streaming solutions).

In-depth knowledge of and experience with one or more large-scale distributed technologies, including but not limited to the Hadoop ecosystem, Kafka, Kubernetes, and Spark.

Expertise in Python and Java, or another language such as Scala or R; Linux/Unix scripting; Jinja templates; Puppet scripts; firewall configuration rules setup.

VM setup and scaling, Kubernetes (pod) scaling, managing Docker images with Harbor, and pushing images through CI/CD.

Experience using data formats such as Apache Parquet, ORC, or Avro (see the Parquet sketch after this list). Experience in machine learning algorithms is a plus.

Knowledge of the financial sector and its products.

Higher education (e.g., a "Fachhochschule" degree, i.e., from a university of applied sciences, in a field such as "Wirtschaftsinformatik", i.e., business informatics).
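
For illustration only, a minimal Apache Parquet round trip using pyarrow. The table contents are made-up sample data; ORC and Avro have analogous writer/reader libraries.

    # Minimal Parquet round-trip sketch with pyarrow; the sample data is made up.
    import pyarrow as pa
    import pyarrow.parquet as pq

    # Build a small in-memory table.
    table = pa.table({
        "trade_id": [1, 2, 3],
        "symbol": ["ABC", "DEF", "ABC"],
        "price": [101.5, 99.25, 102.0],
    })

    # Write with snappy compression (a common default in data lakes).
    pq.write_table(table, "trades.parquet", compression="snappy")

    # Read back and verify the contents survived the round trip.
    round_trip = pq.read_table("trades.parquet")
    assert round_trip.equals(table)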

Must have:

Hadoop

Spark

Kafka

DevOps CI/CD

Kindly send your resume to recruit@intellect-minds.com.

Skills & Requirements

Technical Skills

Big data, Data platform, VM servers, Kubernetes, Kafka, Apache stack, Collibra, Dataiku, DevSecOps, Agile, DevOps, Data pipeline development, Automation, Jenkins, Octopus, Ansible, Chef, XL Release, XL Deploy, Tableau, Data science workbench platforms, Scrum, SAFe, Hadoop ecosystem, Spark, Python, Java, Scala, R, Linux/Unix scripting, Jinja templates, Puppet scripts, Firewall config rules setup, VM setup and scaling, K8s scaling, Managing Docker with Harbor, Pushing images through CI/CD, Apache Parquet, ORC, Avro, Machine learning algorithms, Leadership, Communication, Collaboration, Problem-solving, Teamwork, Technical communication, Project management, Finance, Healthcare, Technology, iGaming, AI, ML, Data engineering, Analytics

Employment Type

Full-time

Level

Senior

Posted

4/9/2026
