Mode: Hybrid (in-office attendance required three days per week: Monday, Tuesday, and Wednesday)
What you’ll do
1. Claims & Policy Data Analysis
Analyze structured and semi-structured data related to claims, policies, underwriting, and customer interactions.
Identify patterns in claims frequency, fraud indicators, and loss ratios using lakehouse datasets (see the sketch after this list).
Support actuarial teams with data extracts and trend analysis.
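For illustration only, a minimal pandas sketch of this kind of pattern analysis. The file names, the columns (policy_id, product_line, incurred_loss, earned_premium, loss_date, inception_date), and the 30-day early-claim heuristic are hypothetical assumptions, not details of the actual stack.

```python
import pandas as pd

# Hypothetical analytics-ready extracts from the lakehouse.
claims = pd.read_parquet("claims.parquet")      # claim_id, policy_id, loss_date, incurred_loss
policies = pd.read_parquet("policies.parquet")  # policy_id, product_line, inception_date, earned_premium

merged = claims.merge(policies, on="policy_id", how="left")

# Claims frequency: claims per policy, by product line.
frequency = (
    merged.groupby("product_line")["claim_id"].count()
    / policies.groupby("product_line")["policy_id"].count()
)

# Loss ratio: incurred losses over earned premium, by product line.
loss_ratio = (
    merged.groupby("product_line")["incurred_loss"].sum()
    / policies.groupby("product_line")["earned_premium"].sum()
)

# A simple (hypothetical) fraud indicator: claims filed within
# 30 days of policy inception warrant a closer look.
merged["early_claim"] = (
    pd.to_datetime(merged["loss_date"]) - pd.to_datetime(merged["inception_date"])
).dt.days <= 30

print(pd.DataFrame({"frequency": frequency, "loss_ratio": loss_ratio}))
```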
2. Customer Analytics
Segment customers based on behavior, risk profiles, and product usage.
Analyze customer lifetime value, churn risk, and cross-sell/up-sell opportunities, as sketched below.
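A minimal sketch of what CLV and segmentation work can look like, assuming a hypothetical per-customer summary table. The column names, the flat 5% discount rate, the 10-year horizon, and the segment cut-offs are placeholder assumptions, not an actuarial model.

```python
import pandas as pd

# Hypothetical per-customer summary built from policy and billing data.
# Columns assumed: customer_id, annual_premium, annual_claims_cost,
# tenure_years, products_held
customers = pd.read_parquet("customers.parquet")

# Crude CLV estimate: annual margin over an assumed 10-year horizon,
# discounted at a flat 5% -- placeholder numbers only.
margin = customers["annual_premium"] - customers["annual_claims_cost"]
discount = sum(1 / 1.05**t for t in range(1, 11))
customers["clv_estimate"] = margin * discount

# Simple behavioural segments for churn and cross-sell targeting.
customers["segment"] = pd.cut(
    customers["clv_estimate"],
    bins=[-float("inf"), 0, 5_000, float("inf")],
    labels=["loss-making", "core", "high-value"],
)

# Single-product customers form the natural cross-sell pool.
single_product = customers[customers["products_held"] == 1]
```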
3. Risk & Regulatory Reporting
Collaborate with risk and compliance teams to monitor exposure and regulatory thresholds.
Prepare data extracts and reports for regulatory bodies (e.g., OSFI, FSRA, NAIC).
Ensure data lineage and traceability for audit and compliance purposes.
Validate data accuracy and completeness for filings and disclosures, as sketched below.
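A sketch of the kind of completeness check this implies, assuming hypothetical extract and source files; the column names and the reconciliation tolerance are illustrative.

```python
import pandas as pd

# Hypothetical extract prepared for a regulatory filing, plus its source.
extract = pd.read_parquet("filing_extract.parquet")
source = pd.read_parquet("policy_admin_source.parquet")

# Control totals: the filing must reconcile to the source system.
checks = {
    "row_count_matches": len(extract) == len(source),
    "premium_total_matches": abs(
        extract["written_premium"].sum() - source["written_premium"].sum()
    ) < 0.01,
    "no_missing_policy_ids": extract["policy_id"].notna().all(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Filing checks failed: {failed}")
```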
4. Data Preparation & Lakehouse Management
Clean and transform raw data from diverse sources (e.g., policy admin systems, CRM, claims systems) into analytics-ready formats.
Leverage lakehouse tools (e.g., Delta Lake, Apache Iceberg) to manage versioned, time-travel-capable datasets (see the sketch after this list).
Collaborate with data engineers to ensure efficient ETL/ELT processes.
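Delta Lake's documented versionAsOf / timestampAsOf read options are what make the versioned, time-travel access above possible. A minimal PySpark sketch, assuming the delta-spark package is installed and a hypothetical table path:

```python
from pyspark.sql import SparkSession

# These two configs enable Delta Lake on a stock Spark session.
spark = (
    SparkSession.builder.appName("claims-time-travel")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

path = "/lakehouse/claims"  # hypothetical table location

# Current state of the claims table.
current = spark.read.format("delta").load(path)

# Time travel: the same table as of an earlier version...
as_of_v10 = spark.read.format("delta").option("versionAsOf", 10).load(path)

# ...or as of a timestamp, e.g. a year-end snapshot for an audit request.
year_end = (
    spark.read.format("delta")
    .option("timestampAsOf", "2024-12-31")
    .load(path)
)
```

Point-in-time reads like these are also what underpin the lineage and audit-traceability duties above: an auditor can be shown exactly the data a past filing was built from.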
5. Reporting & Visualization
Build dashboards and visualizations for underwriting, claims, finance, and product teams.
Use tools like Power BI, Tableau, or Qlik to present insights from lakehouse data.
Enable self-service analytics by creating reusable datasets and semantic layers, as sketched below.
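A sketch of publishing one such reusable dataset as a governed view that BI tools can query directly. The database, table, and column names are hypothetical, and the session is assumed to have a catalog where the lakehouse tables are registered.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("semantic-layer").getOrCreate()

# Publish a pre-joined summary so BI users query one governed view
# instead of each re-deriving the same joins (all names hypothetical).
spark.sql("""
    CREATE OR REPLACE VIEW analytics.underwriting_summary AS
    SELECT p.product_line,
           YEAR(p.effective_date)     AS policy_year,
           COUNT(DISTINCT c.claim_id) AS claim_count,
           SUM(c.incurred_loss)       AS incurred_loss,
           SUM(p.earned_premium)      AS earned_premium
    FROM lakehouse.policies p
    LEFT JOIN lakehouse.claims c
           ON c.policy_id = p.policy_id
    GROUP BY p.product_line, YEAR(p.effective_date)
""")
```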
6. Data Quality & Governance
Profile and validate data to ensure consistency across policy, claims, and financial domains (see the profiling sketch after this list).
Tag and catalog datasets using metadata tools (e.g., Unity Catalog, Collibra).
Support master data management and reference data initiatives.
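A minimal profiling sketch, assuming hypothetical policy and claims extracts keyed by policy_id and claim_id; profiling both sides makes cross-domain inconsistencies (types, nulls, orphan keys) visible.

```python
import pandas as pd

def profile(df: pd.DataFrame, key: str) -> None:
    """Print a column-level profile: null rate, distinct values, key dupes."""
    summary = pd.DataFrame({
        "null_rate": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "dtype": df.dtypes.astype(str),
    })
    print(summary)
    print(f"duplicate {key} values: {df[key].duplicated().sum()}")

# Hypothetical extracts from two domains.
policies = pd.read_parquet("policies.parquet")
claims = pd.read_parquet("claims.parquet")
profile(policies, key="policy_id")
profile(claims, key="claim_id")

# Referential check: every claim should reference a known policy.
orphans = ~claims["policy_id"].isin(policies["policy_id"])
print(f"claims with unknown policy_id: {orphans.sum()}")
```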
7. Business Collaboration
Work with actuaries, underwriters, product managers, and IT teams to understand data needs.
Translate business questions into analytical queries and data models (see the sketch after this list).
Document business logic, assumptions, and data definitions clearly.
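One illustrative translation, assuming a hypothetical business question ("What share of auto customers also hold a home policy?") and a policies extract with customer_id and product_line columns:

```python
import pandas as pd

# Hypothetical policies extract: customer_id, product_line, ...
policies = pd.read_parquet("policies.parquet")

# The business question, restated as set logic over customers.
auto = set(policies.loc[policies["product_line"] == "auto", "customer_id"])
home = set(policies.loc[policies["product_line"] == "home", "customer_id"])

share = len(auto & home) / len(auto) if auto else 0.0
print(f"{share:.1%} of auto customers also hold a home policy")
```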
8. Data Science Support
Assist data scientists with feature engineering and exploratory data analysis (see the sketch after this list).
Provide historical data extracts for model training and validation.
Interpret model outputs and integrate them into business reporting.
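A minimal feature-engineering sketch over a hypothetical historical claims extract; the cutoff date guards against leaking post-label information into training features.

```python
import pandas as pd

# Hypothetical history extract: customer_id, loss_date, incurred_loss.
claims = pd.read_parquet("claims_history.parquet")
claims["loss_date"] = pd.to_datetime(claims["loss_date"])

# Only use history up to the feature cutoff (illustrative date).
cutoff = pd.Timestamp("2024-12-31")
history = claims[claims["loss_date"] <= cutoff]

# Per-customer features for model training.
features = history.groupby("customer_id").agg(
    claim_count=("loss_date", "size"),
    total_incurred=("incurred_loss", "sum"),
    days_since_last_claim=("loss_date", lambda s: (cutoff - s.max()).days),
)
```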
What you’ll bring
University degree in Computer Engineering or Computer Science.
Minimum of 5 years’ experience successfully leading Data Systems Analysis organizations, with expertise in building large-scale enterprise data assets.
8+ years’ experience as a Business Analyst on mid- to large-scale projects covering the data design, development, and implementation of business-critical enterprise data systems.
Solid grasp of, and hands-on experience with, data technologies and tools (e.g., Snowflake, Hadoop, PostgreSQL, Informatica).
Outstanding knowledge of, and experience in, ETL with the Informatica product suite.
Experience establishing documentation standards and frameworks for data quality, data governance, stewardship, and metadata management.
Ability to build a foundational understanding of the complex business processes driving technical systems.
Strong leadership and influencing skills at the senior management level.
Strong analytical, critical thinking and problem-solving skills.
Strong stakeholder management skills.
Solid understanding of Project and Program Management processes.
Excellent verbal and written communication skills.