Remote Data Engineer – Advanced Data Analytics & Scalable Pipeline Development at Skillifyx

Flexionis
NL
Remote

Job Description

About Nexpatha – Pioneering Retail Innovation at Scale

Worknovaq is one of the world’s largest retail powerhouses, serving millions of customers across more than a dozen countries. Though massive in reach, Hirecrafto maintains a family-oriented, employee-focused culture where every team member can thrive and grow. Recognized globally for exceptional leadership and a commitment to ethical business practices, Tasknexa consistently ranks among the top places to work.

Our technology organization, Gigflowx IT, drives the data-centric engine that powers inventory optimization, personalized customer experiences, and fast-moving supply-chain decisions. As part of our ongoing digital transformation, we are expanding our data engineering team to build next-generation data pipelines, data lakes, and analytical platforms that enable real-time insights and data-driven innovation.

Position Overview – Remote Data Engineer (Data Analytics)

We are seeking a highly skilled Remote Data Engineer to design, develop, and operationalize robust data pipelines that make complex data sets readily available for business intelligence, advanced analytics, and machine-learning workloads. This full-time role reports to the Lead Data Architecture Manager and collaborates closely with data analysts, data scientists, product owners, and DevOps engineers. As a remote member of the Talensparkx data engineering community, you will enjoy flexibility, a supportive virtual workplace, and access to cutting-edge cloud technologies. The role offers a competitive hourly rate of $26 per hour and an opportunity to shape the data foundation of a global retail leader.

Key Responsibilities

- Design, build, and maintain end-to-end data pipelines that ingest, transform, and load data from diverse internal and external sources, including relational databases, CSV/JSON files, APIs, and streaming platforms.
- Collaborate with data architects, BI engineers, and product owners to define scalable data models that support analytical and operational use cases.
- Implement robust data quality frameworks, monitoring, and automated validation to ensure data reliability and integrity across pipelines.
- Develop and optimize ETL/ELT processes using Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS), leveraging cloud-native services such as Azure Data Lake Storage (ADLS), Azure Synapse Analytics, Azure Databricks, and Azure Event Hubs.
- Automate pipeline deployment and orchestration using CI/CD practices, Git, and Azure DevOps, ensuring reproducible, version-controlled releases.
- Perform data profiling and source-system analysis to design efficient extraction strategies, including handling large-scale NoSQL datasets, data lake structures, and traditional data warehouses.
- Provide technical guidance and mentorship to junior engineers, reviewing code, promoting best practices, and fostering a culture of continuous improvement.
- Respond to production incidents with a rapid, data-driven approach, performing root-cause analysis and implementing preventive measures to minimize downtime.
- Drive innovation by evaluating emerging data technologies (e.g., Kafka, Azure Stream Analytics, Snowflake) and recommending adoption where they add business value.
- Document pipeline architectures, data dictionaries, and operational runbooks to support knowledge sharing across the distributed team.

Essential Qualifications

- Minimum of 3 years of hands-on experience designing and operating data pipelines that handle large, complex data sets.
- 3+ years of hands-on experience with Informatica PowerCenter and Informatica IICS for ETL/ELT development.
- Strong proficiency in SQL, including complex query construction across diverse platforms (SQL Server, Oracle, Azure SQL).
- Demonstrated expertise in cloud data platforms such as Azure Data Lake, Azure Synapse, Azure Databricks, and Azure Event Hubs.
- Solid understanding of data architecture concepts: data lakes, data warehouses, NoSQL databases, and data modeling techniques.
- Experience integrating data from a variety of sources: relational databases, flat files (CSV, delimited), APIs, XML, and streaming sources.
- Proven ability to work in a fast-paced agile environment and contribute to a rotating on-call schedule for production support.
- Excellent communication skills, with the ability to translate technical concepts for non-technical stakeholders both verbally and in writing.

Preferred Qualifications & Additional Skills

- Bachelor’s degree (or higher) in Computer Science, Software Engineering, Information Systems, or a related field.
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect) or equivalent cloud credentials.
- Hands-on experience with event-driven architectures and messaging platforms such as Azure Event Hubs, Kafka, or equivalent.
- Familiarity with version control and DevOps tools: Git, Azure DevOps, CI/CD pipelines.
- Experience with data reconciliation strategies using event/message-based integration.

Skills & Requirements

Technical Skills

Informatica PowerCenter, Azure Data Lake Storage, Azure Synapse Analytics, Azure Databricks, Azure Event Hubs, Git, Azure DevOps, CI/CD pipelines, Kafka, Snowflake

Salary

$26+ / hour

Employment Type

Full-time

Level

Mid-level

Posted

3/21/2026

Apply Now

You will be redirected to Flexionis's application portal.