Principal Fullstack Data Engineer

Las Vegas Sands Corp.
Dallas, US
On-site

Job Description

Position Overview

The primary responsibility of the Principal Fullstack Data Engineer is to lead the design and implementation of our data architecture, pipelines, and analytics portal for a casino management system being developed from the ground up. This role requires extensive technical expertise and leadership skills to build scalable, reliable, and high-performance data solutions using a modern Iceberg Data Lakehouse architecture that supports real-time processing, analytics, and reporting. The Principal Fullstack Data Engineer will also contribute to hands-on development of a React.js/Node.js Reporting & Analytics Portal and collaborate with cross-functional teams to ensure seamless integration and data flow across various systems.

All duties are to be performed in accordance with departmental and Las Vegas Sands Corp.’s policies, practices, and procedures. All Las Vegas Sands Corp. Team Members are expected to conduct and carry themselves in a professional manner at all times. Team Members are required to observe the Company’s standards, work requirements and rules of conduct.

Essential Duties & Responsibilities

  • Lead the design and development of a robust Iceberg Data Lakehouse architecture using Apache Iceberg, Starburst/Trino, and Lakekeeper, ensuring it is scalable, secure, and adaptable for future needs.
  • Architect, develop, and maintain complex data pipelines using Python and Apache Kafka for efficient data ingestion, transformation, and storage into Iceberg tables, ensuring high availability and quality.
  • Design and develop features for the React.js/Node.js Reporting & Analytics Portal, including interactive dashboards, canned reports, self-service analytics, and scheduled data exports.
  • Build and maintain RESTful APIs using Node.js to serve data from the lakehouse to the analytics portal, supporting JSON and CSV export formats for use cases such as regulatory reporting (e.g., DICJ access).
  • Oversee the integration of diverse data sources (e.g., transactional systems, third-party APIs, IoT devices) through Kafka event streaming and Starburst query federation to create a unified data ecosystem for the casino management system.
  • Define and implement data ingestion and transformation processes using a Lambda-Kappa hybrid architecture that supports both batch and real-time analytics, optimizing data movement and processing efficiency.
  • Develop and maintain PostgreSQL database schemas and queries to support application state, metadata management, and portal functionality.
  • Establish data governance policies and best practices using Starburst’s governance capabilities to ensure compliance with regulatory standards, data security, and privacy protocols.
  • Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions – including embedded Metabase dashboards and custom portal features – that enable data-driven decision-making.
  • Monitor and optimize data pipeline performance using Prometheus and Grafana, identifying bottlenecks and implementing enhancements to ensure rapid data processing and retrieval.
  • Provide technical leadership and mentorship to data engineering teams, promoting best practices in data engineering and front-end development, and fostering a culture of innovation.
  • Maintain comprehensive documentation of data architecture, API contracts, portal features, workflows, and processes to support ongoing development and operational maintenance.
  • Perform job duties in a safe manner.
  • Attend work as scheduled on a consistent and regular basis.
  • Perform other related duties as assigned.

Minimum Qualifications

  • At least 21 years of age.
  • Proof of authorization to work in the United States.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
  • Must be able to obtain and maintain any certification or license, as required by law or policy.
  • 8+ years of experience in data engineering, with at least 3 years in a principal or lead role, preferably in the gaming or casino industry.
  • Expertise with Apache Kafka for real-time event streaming and data pipeline orchestration at scale.
  • Hands-on experience with Apache Iceberg table formats, including partitioning strategies, schema evolution, and snapshot management.
  • Experience with Starburst/Trino (or another modern query engine such as StarRocks or Dremio) as a distributed SQL query engine for analytics, reporting, and query federation across heterogeneous data sources.
  • Hands-on experience with React.js for building interactive front-end analytics dashboards and reporting interfaces.
  • Hands-on experience with Node.js for building RESTful APIs and back-end services that serve data to web applications.
  • Deep knowledge of PostgreSQL (16+) including advanced query optimization, indexing strategies, and schema design for both application and analytical workloads.
  • Experience with Iceberg catalog management.

Skills & Requirements

Technical Skills

Apache Iceberg, Starburst/Trino, Lakekeeper, Python, Apache Kafka, React.js, Node.js, PostgreSQL, Metabase, Prometheus, Grafana, Leadership, Collaboration, Problem-solving, Data engineering, Data architecture, Analytics portal, Casino management system

Employment Type

FULL TIME

Level

Principal

Posted

4/20/2026
