Junior Data Engineer (2 years' experience)

Kris Infotech : Technology & Talent - Synced
SG
On-site

Job Description

Job Responsibilities:

  • Design, develop, and maintain end-to-end data pipelines for ingesting, transforming, and delivering data from multiple source systems (databases, files, APIs, streaming platforms).
  • Build and optimize ETL / ELT workflows using SQL, Python, and enterprise data integration tools.
  • Ensure data pipelines are scalable, resilient, and performant to meet operational and analytical requirements.
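
The pipeline work above can be sketched in miniature. This is an illustrative example only, assuming a CSV source and an in-memory SQLite target as stand-ins for the enterprise source systems and databases the role actually involves; the table and column names are made up.

```python
# Minimal extract-transform-load sketch. CSV text and SQLite stand in
# for real source systems and an enterprise RDBMS (illustrative only).
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # skip records with a missing measure
        cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
    return cleaned

def load(rows, conn):
    """Load: upsert transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO orders (id, amount) VALUES (:id, :amount)", rows
    )
    conn.commit()

raw = "id,amount\n1,10.50\n2,\n3,7.25\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

In production the same extract/transform/load shape would typically be orchestrated by a scheduler and pointed at the databases, files, APIs, or streams named in the responsibilities.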

Database & Data Platform Management

  • Work hands-on with RDBMS platforms such as Oracle, DB2, SQL Server, or PostgreSQL for data extraction, transformation, and performance tuning.
  • Develop and optimize SQL queries, views, and stored procedures to support reporting and analytics use cases.
  • Support data modelling activities (logical and physical) for analytics and reporting layers.
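
A small sketch of the query-tuning loop described above: inspect the plan, add an index, confirm the plan changes. SQLite's EXPLAIN QUERY PLAN is used here as a stand-in for the plan tools of Oracle, DB2, SQL Server, or PostgreSQL; the schema is invented for illustration.

```python
# Query tuning sketch: compare the plan for the same query before and
# after adding an index (SQLite as a stand-in RDBMS, illustrative only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("SG", 100.0), ("MY", 50.0), ("SG", 75.0)],
)

def plan(sql):
    """Return the first step of the query plan as text."""
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

query = "SELECT SUM(amount) FROM sales WHERE region = 'SG'"
before = plan(query)  # without an index: a full table scan
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = plan(query)   # with the index: an index search on region
```

The same habit carries over to the enterprise platforms listed: read the execution plan first, then tune with indexes, rewritten predicates, or materialised views.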

Data Quality, Governance & Operations

  • Implement data validation, reconciliation, and monitoring to ensure data accuracy, completeness, and consistency.
  • Support operational data activities, including incident investigation, root cause analysis, and remediation.
  • Maintain clear documentation for data pipelines, schemas, and operational processes to support audits and knowledge transfer.
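
The validation and reconciliation duties above can be sketched as a simple check comparing a source extract against its loaded target. The function name, keys, and thresholds here are illustrative assumptions, not a prescribed framework.

```python
# Reconciliation sketch: compare row counts, control totals, and key
# coverage between source and target (names are illustrative only).
def reconcile(source_rows, target_rows, key="id", measure="amount"):
    """Return a list of human-readable discrepancies (empty if clean)."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    src_total = sum(r[measure] for r in source_rows)
    tgt_total = sum(r[measure] for r in target_rows)
    if abs(src_total - tgt_total) > 1e-9:
        issues.append(f"control total mismatch: {src_total} vs {tgt_total}")
    missing = {r[key] for r in source_rows} - {r[key] for r in target_rows}
    if missing:
        issues.append(f"missing keys in target: {sorted(missing)}")
    return issues

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.0}]
target = [{"id": 1, "amount": 10.0}]
findings = reconcile(source, target)
```

Checks like this would normally run after each load and feed the monitoring and incident-investigation work described above.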

Collaboration & Stakeholder Engagement

  • Collaborate with business users, product owners, and downstream teams to gather requirements and translate them into technical solutions.
  • Work closely with Data Analysts, BI developers, and Data Scientists to enable dashboards, reports, and advanced analytics.
  • Participate in Agile ceremonies and contribute to sprint planning, estimation, and delivery.

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent practical experience.
  • Strong hands-on experience with SQL and relational databases (Oracle, DB2, SQL Server, PostgreSQL).
  • Experience building and supporting ETL / data pipelines in enterprise environments.
  • Solid understanding of data modelling, data quality, and data lifecycle management.
  • Ability to troubleshoot data issues and work in production / operational environments.
  • Experience with Python for data processing or automation.
  • Experience with data streaming technologies (e.g. Kafka, Spark, NiFi).
  • Experience with BI and visualization tools such as Tableau, Qlik, or Power BI.

Skills & Requirements

Technical Skills

SQL, Python, ETL/ELT workflows, RDBMS platforms, data modelling, data validation, data reconciliation, data monitoring, data pipelines, data schemas, operational processes, BI and visualization tools, data streaming technologies

Employment Type

FULL TIME

Level

mid

Posted

4/14/2026
