Role Overview
We are looking for a Data Engineer/Analytics Engineer to join a cross‑functional technology team responsible for building and maintaining modern financial data platforms. This role focuses on the design and delivery of reliable data transformation layers that support sub‑ledger reporting, reconciliation, and downstream analytics.
The role is hands‑on and execution‑focused, working on Snowflake and Sigma to transform transactional data into governed, analytics‑ready models. You will collaborate closely with finance, product control, engineering, and analytics partners to ensure data accuracy, transparency, and usability.
This is an ideal role for an engineer early in their finance‑data journey who wants to develop strong foundations in financial data modeling, controls, and modern cloud tooling.
Key Responsibilities
- Build and maintain modular data models and transformation pipelines in Snowflake to support sub‑ledger and financial reporting use cases.
- Work closely with analysts, finance partners, and engineers to understand requirements.
- Develop transformation logic to convert raw transactional data into clean, well‑structured fact and dimension tables.
- Enforce data quality through rigorous testing, documentation, and version control following DevOps processes.
- Participate in Agile ceremonies (e.g., stand-ups, sprint planning) and manage tasks using Jira.
- Optimize Snowflake performance (clustering, warehouse sizing, query tuning).
- Implement and follow data access controls and security practices for regulated financial data.
- Help automate workflows using Airflow, Snowflake Tasks, or TWS.
- Monitor pipeline executions, investigate failures, and support issue resolution.
- Support end users and downstream data consumers.
Mandatory Qualifications & Skills
- 3–5 years of experience in data engineering or analytics engineering roles.
- Hands‑on experience working with Snowflake as a cloud data warehouse.
- Strong proficiency in SQL and understanding of dimensional data modeling concepts.
- Working knowledge of Python for pipeline integration or orchestration tasks.
- Familiarity with Git‑based version control and DevOps practices.
- Ability to work effectively with both technical and finance stakeholders.
- Experience working in Agile/Scrum environments.
- Experience with analytics/BI tools such as Sigma or Qlik is a strong plus.
Nice‑to‑Have Skills
- Exposure to financial or sub‑ledger datasets (trades, positions, balances, movements).
- Familiarity with Data Vault and related modeling concepts.
- Understanding of MCP (Model Context Protocol) and AI agents.
Education & Experience
- Bachelor’s degree in Computer Science, Data Engineering, Finance Technology, or a related field.
- Prior experience working in cross‑functional teams involving engineers, analysts, and business users.
- Strong attention to detail, willingness to learn, and a focus on data accuracy and reliability.