Data Solutions Analyst

Terra Dygital Solutions Inc.
Toronto, CA; US
On-site

Job Description

Company Overview

Terra Dygital Solutions is an innovative IT services provider headquartered in Vancouver, BC. We are dedicated to effectively addressing a diverse range of complex IT challenges by leveraging modern technologies. Our core expertise lies in advising clients on cybersecurity and business/system architecture through our Virtual CIO services, developing cutting-edge solutions via application development and system integration, and managing and optimizing desktop, server, and network environments as a Managed Services Provider. Join our dynamic and rapidly growing team, where you will collaborate with top-notch professionals, work with the latest technologies, and play a crucial role in our clients’ success, all while enjoying significant opportunities for personal and professional growth.

Position Overview

The Data Solutions Analyst is a technically well-rounded data professional responsible for designing, building, and delivering data solutions across the Microsoft data stack in a client-facing consulting environment. This role translates complex business requirements into reliable, scalable data pipelines, models, and reporting solutions, then sees those solutions through to production and into the hands of clients. Working closely with client stakeholders and internal delivery teams, the Data Solutions Analyst owns the technical execution of data workstreams, from ETL pipeline development and semantic modelling to Power BI reporting and data warehouse design. The role demands both strong technical depth and the communication skills to engage confidently with clients, document work clearly, and manage concurrent engagements with professionalism and discipline.

Job Duties and Responsibilities

  • Engage directly with client stakeholders to gather and translate business requirements into technical data solutions, ensuring alignment on scope, approach, and deliverables.
  • Design and implement ETL/ELT pipelines using Microsoft Fabric and SSIS, applying incremental load strategies and refresh optimization to support reliable, performant data movement.
  • Architect and build data solutions aligned to Medallion Architecture principles (Bronze/Silver/Gold), ensuring clean separation of concerns across ingestion, transformation, and consumption layers.
  • Design data warehouse structures using star and snowflake schemas, with a focus on maintainability, query performance, and scalability.
  • Develop and maintain semantic models to support self-serve analytics and governed reporting across client environments.
  • Write and optimize T-SQL queries and stored procedures to support transformations, business logic, and data validation requirements.
  • Develop Python scripts to automate data transformation workflows, support pipeline logic, and improve delivery efficiency.
  • Build and maintain Power BI reports, dashboards, and semantic models that deliver clear, accurate, and actionable insights to client end users.
  • Apply data quality and validation practices throughout the pipeline lifecycle to ensure accuracy, completeness, and consistency of delivered data.
  • Produce clear, high-quality technical documentation, including solution designs, data dictionaries, pipeline documentation, and operational handoff materials, to support clean project transitions.
  • Manage workload effectively across multiple concurrent client engagements, balancing delivery priorities and communicating proactively on progress and risks.
  • Support post-implementation stabilization, troubleshoot data and pipeline issues, and contribute to continuous improvement of delivered solutions.
  • Participate in internal knowledge-sharing by contributing reusable patterns, templates, and delivery documentation.
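For illustration only (this sketch is not part of the posting): the incremental load strategies mentioned in the duties above are commonly implemented with a high-watermark pattern, where each run picks up only rows modified since the last run. All names and data here are hypothetical.

```python
from datetime import datetime

def incremental_load(source_rows, target, watermark):
    """Append only rows modified after the last-seen watermark,
    then advance the watermark to the newest row processed."""
    new_rows = [r for r in source_rows if r["modified_at"] > watermark]
    target.extend(new_rows)
    if new_rows:
        watermark = max(r["modified_at"] for r in new_rows)
    return watermark

# Hypothetical source rows with modification timestamps
source = [
    {"id": 1, "modified_at": datetime(2026, 5, 1)},
    {"id": 2, "modified_at": datetime(2026, 5, 6)},
    {"id": 3, "modified_at": datetime(2026, 5, 7)},
]
target = []
wm = incremental_load(source, target, datetime(2026, 5, 5))
# Only the rows newer than the 2026-05-05 watermark are loaded
```

In a production pipeline the watermark would be persisted (for example in a control table) between runs rather than held in memory.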

Skills and Qualifications

  • Hands-on proficiency with Microsoft Fabric and SSIS for pipeline and ETL development.
  • Solid understanding of Medallion Architecture, data modelling principles, and data warehouse design (star/snowflake schemas).
  • Strong T-SQL skills, including stored procedures, query optimization, and data transformation logic.
  • Working knowledge of Python for scripting, data transformation, and automation.
  • Demonstrated experience developing Power BI solutions, including reports, dashboards, and semantic models.
  • Familiarity with incremental load patterns, refresh scheduling, and pipeline performance optimization.
  • Commitment to data quality; comfortable implementing validation rules, reconciliation checks, and testing practices.
  • Strong client-facing communication skills; able to participate confidently in client meetings, explain technical concepts to non-technical audiences, and maintain professional relationships.
  • Excellent documentation habits and attention to detail in all technical deliverables.
  • Ability to manage and prioritize work across multiple active client engagements without sacrificing quality.
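As a purely illustrative sketch (not from the posting), the reconciliation checks listed above might compare row counts and summed totals between a source and its loaded target; the function and field names are hypothetical.

```python
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Compare row counts and summed amounts between source and
    target; return a dict of check results for logging or alerting."""
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "amount_total_match": (
            sum(r[amount_key] for r in source_rows)
            == sum(r[amount_key] for r in target_rows)
        ),
    }
    checks["passed"] = all(checks.values())
    return checks

# Hypothetical source and target extracts
src = [{"amount": 100}, {"amount": 250}]
tgt = [{"amount": 100}, {"amount": 250}]
result = reconcile(src, tgt)
```

The same idea scales to T-SQL reconciliation queries run post-load, with failures surfaced to an alerting channel.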

Education and Experience

  • Diploma, certificate, or degree in Computer Science, In

Skills & Requirements

Technical Skills

Microsoft Fabric, SSIS, T-SQL, Python, Power BI, Communication, Problem Solving, Teamwork, Data, ETL, Data Warehousing

Employment Type

Full-time

Level

Mid

Posted

5/8/2026
