Location: Los Angeles, CA
Onsite/Hybrid/Remote: Hybrid
Duration: 12 Months
Rate Range: Max is $96/hr on C2C or $89/hr on W2
Work Authorization: GC, USC, all valid EADs; no H-1B, OPT, or CPT
Must Have:
- Databricks
- Snowflake
- Redshift
- AWS (S3, Glue, Lambda)
- Advanced SQL (performance tuning)
- Python
- Airflow or dbt
Responsibilities:
- Design and build scalable data pipelines for large datasets
- Develop batch and real-time data processing solutions
- Build on Databricks, Snowflake, and Redshift as core data platforms
- Optimize SQL queries and improve overall data-processing performance
- Build and maintain workflows using Airflow or dbt
- Ensure data quality, reliability, and governance
- Partner with data scientists to deploy ML models
- Collaborate with teams to translate business needs into data solutions
Qualifications:
- 7+ years of data engineering experience
- Strong experience with AWS data services
- Hands-on coding in Python and SQL
- Experience with distributed data systems and big data tools
- Experience building enterprise-scale data platforms
Nice to Have:
- Monte Carlo (data observability)
- Atlan (data catalog)
- CI/CD (GitHub Actions or Jenkins)
- Experience with ML or statistical modeling