[Remote] Solution Engineer - Data Engineering Specialist

Flexionis
AU
Remote

Job Description

Note: This is a remote role open to candidates in the USA. Snowflake is about empowering enterprises, and people, to achieve their full potential. They are seeking a Data Engineering Specialist to provide technical leadership in designing and architecting the Snowflake Cloud Data Platform. The role involves working with sales teams to understand customer needs and demonstrating the value of Snowflake technology throughout the sales cycle.

Responsibilities

• Apply your multi-cloud data architecture expertise while presenting Snowflake technology and vision to executives and technical contributors at strategic prospects, customers, and partners
• Work hands-on with prospects and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation
• Immerse yourself in the ever-evolving industry, maintaining a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them
• Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing

Skills

• 10+ years of architecture and data engineering experience within the Enterprise Data space
• 5+ years of experience within a pre-sales environment (Sales Engineer, Solutions Engineer, Solutions Architect, etc.)
• Outstanding presentation skills for both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos
• Ability to connect a customer's specific business problems with Snowflake's solutions
• Ability to perform deep discovery of a customer's architecture framework and connect it with Snowflake's data architecture
• Broad range of experience within large-scale database and/or data warehouse technology, ETL, analytics, and cloud technologies (for example, Data Lake, Data Mesh, Data Fabric)
• Hands-on development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies
• Deep understanding of data integration services and tools for building ETL and ELT data pipelines, such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica
• Familiarity with streaming technologies (e.g., Kafka, Flink, Spark Streaming, Kinesis) and real-time or near-real-time use cases (e.g., CDC)
• Experience designing interoperable data lakehouse architectures and working with Iceberg, Delta, and Parquet
• Strong architectural expertise in data engineering to confidently present and demo to business executives and technical audiences, and to effectively handle impromptu questions
• Bachelor's degree required
• Master's degree in computer science, engineering, mathematics, or a related field, or equivalent experience, preferred

Benefits

• Medical
• Dental
• Vision
• Life
• Disability insurance
• 401(k) retirement plan
• Flexible spending & health savings accounts
• At least 12 paid holidays
• Paid time off
• Parental leave
• Employee assistance program
• Other company benefits

Company Overview

Snowflake is a cloud data platform that provides a data warehouse as a service designed for the cloud. It was founded in 2012 and is headquartered in San Mateo, California, USA, with a workforce of 5,001-10,000 employees.

Company H1B Sponsorship

Snowflake has a track record of offering H1B sponsorship, with 428 sponsorships in 2025, 281 in 2024, 154 in 2023, 182 in 2022, 113 in 2021, and 98 in 2020. Please note that this does not guarantee sponsorship for this specific role.

Skills & Requirements

Technical Skills

SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, Apache NiFi, Matillion, Fivetran, Qlik, Informatica, Kafka, Flink, Spark Streaming, Kinesis, Iceberg, Delta, Parquet, presentation, discovery, architecture, cloud, data engineering, data warehouse, ETL, ELT, data lake, data mesh, data fabric, big data, streaming, real-time, CDC

Level

mid

Posted

3/17/2026

Apply Now

You will be redirected to Flexionis's application portal.