Hi,
This is Kumar Saurabh from HireOn Tech. I have an urgent job opening that may interest you. Please let me know if you are looking for a better opportunity and would like to work with us.
Role / Skill: Data Modeler / Data Business Analyst (Capital Markets)
Location: Toronto; onsite from Day 1 (hybrid, 3 days per week in office).
JD:
1. Capital Markets Data Modeling
- Design and maintain physical and logical data models for Capital Markets domains, including trade execution, portfolio positions, market data (ticks/pricing), and risk metrics.
- Translate complex financial hierarchies (e.g., fund-of-funds, multi-asset class structures) into optimized Databricks structures.
- Ensure data models support temporal requirements, such as Point-in-Time (PIT) analysis and "As-Of" financial reporting (a minimal sketch follows this section).
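For context, "As-Of" reporting is typically served from a bitemporal table in which each row carries the window during which it was the valid record. Below is a minimal, hedged sketch in PySpark; the positions table and its column names are illustrative assumptions, not part of this role's actual schema.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical bitemporal positions table: each row is valid from
# effective_ts until expiry_ts (a NULL expiry_ts means still current).
as_of = "2024-03-31 23:59:59"

positions_as_of = spark.sql(f"""
    SELECT portfolio_id, instrument_id, quantity, market_value
    FROM positions
    WHERE effective_ts <= TIMESTAMP '{as_of}'
      AND (expiry_ts IS NULL OR expiry_ts > TIMESTAMP '{as_of}')
""")
positions_as_of.show()
```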
2. Advanced STM (Source-to-Target Mapping) Creation
- Lead the creation of comprehensive STM documents, detailing the journey from legacy financial systems and external providers (e.g., Bloomberg, Reuters, Aladdin) to ADLS Gen2.
- Define complex transformation logic, business rules, and data enrichment steps within the STM to guide Data Engineering squads (see the illustrative mapping after this section).
- Map data lineage to ensure full traceability for regulatory compliance.
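To make the STM deliverable concrete: each mapping row pairs a source field with a target column plus the business rule connecting them. Here is a hedged sketch of how one such row might be implemented downstream in PySpark; the feed path, field name, and conversion rule are all illustrative, not taken from an actual Bloomberg STM.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Illustrative STM row: source field BBG_PX_LAST -> target column close_price.
# Business rule: pence-quoted (GBp) prices are converted to GBP on load.
src = spark.read.format("delta").load("/mnt/raw/bloomberg/eod_prices")

tgt = (
    src.withColumn(
        "close_price",
        F.when(F.col("currency") == "GBp", F.col("BBG_PX_LAST") / 100)
         .otherwise(F.col("BBG_PX_LAST")),
    )
    .withColumn(
        "currency",
        F.when(F.col("currency") == "GBp", F.lit("GBP"))
         .otherwise(F.col("currency")),
    )
    .select("instrument_id", "price_date", "close_price", "currency")
)

# Land the curated result in ADLS Gen2 as Delta.
tgt.write.format("delta").mode("overwrite").save("/mnt/curated/eod_prices")
```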
3. Databricks & ADLS Performance Engineering
- Design Delta Lake structures that prioritize "Shuffle-free" joins for massive capital markets datasets.
- Implement optimized partitioning and Z-Ordering strategies specifically for time-series financial data to enable high-speed analytics (see the sketch after this section).
- Utilize Unity Catalog to govern data access and maintain a centralized metadata repository for the GWAM ecosystem.
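As a hedged illustration of what this looks like in practice (the catalog, schema, table, and group names below are assumptions, not the actual GWAM namespace):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Time-series tick table partitioned by trade date: queries filtering on a
# date range prune entire partitions before any data file is opened.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gwam.market_data.ticks (
        instrument_id STRING,
        trade_date    DATE,
        event_ts      TIMESTAMP,
        price         DECIMAL(18,6),
        volume        BIGINT
    )
    USING DELTA
    PARTITIONED BY (trade_date)
""")

# Z-Ordering co-locates rows with similar instrument_id/event_ts values inside
# each partition, so range scans over a single instrument touch fewer files.
spark.sql("OPTIMIZE gwam.market_data.ticks ZORDER BY (instrument_id, event_ts)")

# Unity Catalog governs access on the same three-level namespace.
spark.sql("GRANT SELECT ON TABLE gwam.market_data.ticks TO `analysts`")
```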
________________________________________
Required Skills & Qualifications
- Industry Expertise: 5+ years of experience in Capital Markets or Wealth Management, with a deep understanding of financial instruments and trade lifecycles.
- Technical Modeling: 7+ years of experience in data modeling, with a mastery of Azure Databricks and ADLS Gen2.
- STM Mastery: Proven track record of creating highly detailed Source-to-Target Mappings for complex data migration or integration projects.
- Data Engine Proficiency: Expert-level proficiency in Spark SQL and PySpark, with the ability to optimize data structures for the Spark Catalyst Optimizer.
- Storage Formats: Expertise in Delta Lake (ACID transactions, Time Travel) and Parquet optimization (a Time Travel example follows this list).
- Governance: Hands-on experience implementing data governance and security via Unity Catalog.
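For candidates less familiar with the Time Travel feature named above: Delta Lake keeps a versioned transaction log, so any table can be read as of an earlier version or timestamp. A minimal sketch follows; the table path is illustrative.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a Delta table as it looked at version 0 (its initial commit).
v0 = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/mnt/curated/eod_prices")
)

# Or query it as of a calendar timestamp, e.g. for a regulatory restatement.
hist = spark.sql(
    "SELECT * FROM delta.`/mnt/curated/eod_prices` TIMESTAMP AS OF '2024-03-31'"
)
hist.show()
```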
________________________________________
Technical Stack
- Compute: Azure Databricks (Jobs, SQL Warehouses).
- Storage: ADLS Gen2 (Delta/Parquet).
- Governance: Unity Catalog.
- Analysis Tools: SQL, Python, Excel (for data profiling).
- Documentation: Confluence/Visio for STM and ERDs.