Dice is the leading career destination for tech experts at every stage of their careers. Our client, Systems Management Group, Inc., is seeking the following. Apply via Dice today!
Role:
Data Engineer
Location:
Dallas, TX (5 days Onsite)
Exp:
10+ Yrs
Top Skills:
DataLake, AWS, Migration, Spark, Python/Java, Hadoop, Kafka
The Engineer will be part of the datastore-migration Factory team, responsible for performing the end-to-end datastore migration from the on-prem Data Lake to an AWS-hosted Lakehouse.
Responsibilities of the Engineer include:
- Logic & Scheduling: Refactoring and migrating extraction logic and job scheduling from legacy frameworks to the new Lakehouse environment.
- Data Transfer: Executing the physical migration of underlying datasets while ensuring data integrity.
- Stakeholder Engagement: Acting as a technical liaison to internal clients, facilitating "hand-off and sign-off" conversations with data owners to ensure migrated assets meet business requirements.
- Consumption Pattern Migration:
- Code Conversion: Translating and optimizing legacy SQL and Spark-based consumption patterns (raw and modeled) for compatibility with Snowflake and Iceberg.
- Usage Analysis: Understanding usage patterns to deliver the required data products.
- Data Reconciliation & Quality:
- A rigorous approach to data validation is required. Candidates must work with reconciliation frameworks to build confidence that migrated data is functionally equivalent to that already used within production flows.
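To illustrate the kind of reconciliation check described above (this sketch is not part of the role description, and the count-plus-checksum approach is an assumption, not the client's actual framework): a common lightweight technique is to compare a row count and an order-independent checksum of each table on both sides of the migration.

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint of a table: row count plus an
    XOR-combined hash of each row's canonical string form."""
    combined = 0
    for row in rows:
        canonical = "|".join(str(v) for v in row)
        digest = hashlib.sha256(canonical.encode("utf-8")).digest()
        # XOR makes the result independent of row order
        combined ^= int.from_bytes(digest[:8], "big")
    return (len(rows), combined)

def reconcile(source_rows, target_rows):
    """True when the migrated table is functionally equivalent
    (same rows, any order) to the source table."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)

# Example: same data in a different physical order still reconciles
source = [(1, "alice"), (2, "bob")]
target = [(2, "bob"), (1, "alice")]
assert reconcile(source, target)
assert not reconcile(source, [(1, "alice")])
```

In practice the same idea is usually pushed down into SQL (COUNT plus an aggregate hash per table or partition) so the data never leaves the warehouse.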
Basic Qualifications:
- Education: Bachelor's or Master's in Computer Science, Applied Mathematics, Engineering, or a related quantitative field.
- Experience: Minimum of 3-5 years of professional "hands-on-keyboard" coding experience in a collaborative, team-based environment. Ability to troubleshoot SQL, plus basic scripting experience.
- Languages: Professional proficiency in Python or Java.
- Methodology: Deep familiarity with the full Software Development Life Cycle (SDLC) and CI/CD best practices, plus Kubernetes (K8s) deployment experience.
Technical Stack Requirements:
Kafka, ANSI SQL, FTP, Apache Spark, JSON, Avro, Parquet, Hadoop (HDFS/Hive), Snowflake, Apache Iceberg, Sybase IQ
Regards,
Prakash
732-934-4169
prakash.v@smg-llc.us