Data engineer (Azure ETL)

Techcarrot
Washington, US
Remote

Job Description

Job Summary

Azure ETL, Synapse DWH, Google Cloud Platform, Google BigQuery, MSBI (SSAS), Power BI development.

This role is responsible for building data orchestration with Azure Data Factory pipelines and dataflows and for working with Google Cloud Platform and Google BigQuery. Key responsibilities include understanding business requirements and implementing reporting in Power BI.

Roles & Responsibilities

  • Understand business requirements and actively provide input from a data perspective.
  • Understand the underlying data and how it flows.
  • Build simple to complex pipelines and dataflows in ADF.
  • Implement modules that incorporate security and authorization frameworks.
  • Recognize and adapt to changes in processes as the project evolves in size and function.
  • Implement visualizations and traditional reporting in Power BI.

Requirements

Knowledge, Skills & Abilities

  • Expert level knowledge on Azure Data Factory and Power BI.
  • Expert level knowledge on Google Cloud Platform and Google BigQuery.
  • Analytics experience; real estate domain knowledge is a plus.
  • Advanced knowledge of Azure SQL DB and Synapse Analytics, Power BI, SSIS, SSAS, and Logic Apps.
  • Should be able to analyze and understand complex data.
  • Experience implementing security, such as row-level security (RLS) in Power BI and data governance in Azure.
  • Knowledge of Azure Data Lake, SQL databases, Stream Analytics, Azure DevOps, and CI/CD.
  • Knowledge of master data management, data warehousing and business intelligence architecture.
  • Experience in data modeling and database design with excellent knowledge of SQL Server best practices.
  • Excellent interpersonal/communication skills (both oral/written) with the ability to communicate at various levels with clarity & precision.
  • Clear understanding of the DW lifecycle, with the ability to contribute to preparing design documents, unit test plans, and code review reports.
  • Experience working in Agile environment (Scrum, Lean, Kanban) is a plus.
  • Knowledge of other BI tools like Qlik is a big plus.
  • Knowledge of big data technologies (Spark, NoSQL, Azure Databricks, and the Hadoop ecosystem: Hive, Impala, HDFS, YARN, Pig, Oozie) is a plus.

Qualifications & Experience

  • Bachelor's or master's degree in computer science or a related field.
  • At least 6-10 years of data engineering or software development experience.

Skills & Requirements

Technical Skills

Azure Data Factory, Power BI, Google Cloud Platform, Google BigQuery, Azure SQL DB, Synapse Analytics, SSIS, SSAS, Logic Apps, Azure Data Lake, SQL databases, Stream Analytics, Azure DevOps, CI/CD, master data management, data warehousing, business intelligence architecture, data modeling, database design, SQL Server, Qlik, big data technologies, Spark, NoSQL, Azure Databricks, Hadoop ecosystem, Hive, Impala, HDFS, YARN, Pig, Oozie, communication, interpersonal skills, data engineering, software development

Salary

$70,000+ per year

Employment Type

Full-time

Level

Senior

Posted

4/28/2026
