Data Architect (Life & Annuity Insurance Experience)

Envision Technology Solutions
Charlotte, US
Remote

Job Description

Dear Applicant,

Please let me know if you are interested.

👨‍💼 Job Title: Data Architect (Life & Annuity Insurance Experience – Mandatory)

🌍 Work Location: Remote (working hours aligned with EST)

💰 Hire Type: Full-time

⏳ Total Experience: 15+ years ➡️ Minimum 5 years in the Insurance domain (Mandatory)

✅ Must-Have Skills

  • 🏦 Insurance Domain Experience: 5+ years
  • 🔁 ETL Development
  • 🐍 Python
  • ⚡ PySpark
  • ☁️ AWS Cloud

🧾 Job Summary

The Data Architect will lead the design and maintenance of the data infrastructure that enables secure data storage, access, and analytics across business and client needs. This role defines how data is collected, structured, integrated, and governed to ensure scalability, security, and performance.

The position plays a critical role in supporting business intelligence, regulatory, and compliance initiatives within the Life & Annuity Insurance domain.

📆 Years of Experience Required

  • 15+ years in Data Engineering
  • At least 5 years exclusively in the Insurance domain

🔑 Key Responsibilities

  • 🧩 Define data architecture strategies, frameworks, and models
  • 🗄️ Design and optimize databases, data warehouses, and data lakes
  • 📜 Ensure data structures align with business requirements and compliance standards
  • 🤝 Collaborate with data engineers, analysts, and client teams on pipeline architecture
  • 🔍 Oversee metadata management, data catalogs, and lineage tracking
  • 🚀 Ensure data integrity, scalability, and performance
  • ☁️ Select appropriate storage and cloud platforms (Snowflake, AWS, Azure, BigQuery, Redshift)
  • 🔐 Support data governance and access control policies
  • 🔄 Review existing systems for enhancements, optimization, or migration
  • 📝 Document technical standards, architectural designs, and decisions
  • 🧭 Lead system design with strong strategic alignment and governance
  • πŸ› οΈ Technical Skills
  • 📊 Experience designing, implementing, and managing data analytics solutions using Databricks in the Insurance domain
  • 🔁 Proven experience building ETL pipelines from multiple data sources using Databricks on AWS
  • ⚙️ Design and implement scalable ETL pipelines to process and transform large datasets
  • 🤝 Collaborate with data scientists, analysts, and stakeholders to deliver high-quality data solutions
  • ✅ Optimize data workflows while ensuring data quality and integrity
  • 🔎 Monitor and troubleshoot data pipeline performance, implementing improvements as needed
  • ☁️ Leverage AWS cloud services for efficient data storage and processing
  • 📘 Document data engineering processes, architecture, and best practices
  • 🌐 Stay up to date with emerging data engineering and cloud technologies
  • 💻 Strong proficiency in Python, PySpark, and SQL with hands-on data pipeline development
  • 🧱 Experience in Data Modeling, Data Lineage, and Canonical Data Model implementation
  • 🥇 Experience implementing Medallion Architecture
  • 🏦 Strong experience in the Insurance domain
  • 🧾 Experience working with JIRA

⭐ Mandatory Skills

  • ⚡ Strong expertise in Databricks, including ETL pipeline development and optimization
  • ☁️ Proven experience with AWS cloud services
  • 🗄️ Solid understanding of data modeling, data warehousing, and data integration
  • 👨‍💻 Proficiency in Python or Scala for data transformation
  • 📊 Strong hands-on experience with Python, PySpark, and SQL
  • 🛡️ Familiarity with data governance and data quality frameworks
  • 📈 Experience with Power BI
  • 🧑‍💻 Experience using GitLab
  • 🔄 Experience working in Agile methodologies

🎓 Education & Certifications

  • 🎓 MCA or Bachelor's degree in Engineering from a reputed institution
  • ✅ Databricks Certification – Preferred
  • ✅ LOMA Certification (Insurance) – Preferred

Skills & Requirements

Technical Skills

Python, PySpark, AWS Cloud, ETL Development, Databricks, ETL pipelines, data modeling, data warehousing, data integration, data governance, data quality, Power BI, GitLab, Agile methodologies, Databricks Certification, LOMA Certification, insurance

Employment Type

FULL TIME

Level

Mid

Posted

4/13/2026
