Position: Cloud Data Engineer - AWS & Big Data
Location: Singapore
Type of Employment: Full-Time
Key Result Areas and Activities:
- Share and Build Expertise - Develop and share expertise in the cloud solutioning domain, actively draw on the experience and expertise within the organization for sharing across teams and clients in the firm, and support the Cloud COE initiatives
- Nurture and Grow Talent - Support recruitment, coaching, and mentoring, and help build the firm's cloud practice capacity
- Develop and maintain scalable data pipelines and build out new API integrations to support continuing growth in data volume and complexity
- Collaborate with analytics and business teams to improve the data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization
- Implement processes and systems to monitor data quality, ensuring production data is accurate and available for the key stakeholders and business processes that depend on it
- Work closely with a team of frontend and backend engineers, product/project managers, and analysts
Essential Skills:
- Strong understanding of cloud and data engineering concepts in AWS/Azure/GCP
- In-depth knowledge of key cloud services for data integration, BI, and data processing
- In-depth knowledge of cloud storage and compute services
- Perform data analysis to troubleshoot and help resolve data-related issues
- Knowledge of containerization with Docker and orchestration through Kubernetes
- Good expertise in big data services such as Spark, Pig, Hive, Airflow, etc.
- Strong experience with event stream processing technologies such as Kafka
- Experience with at least one programming language (Java, Scala, Python)
- Operating systems (must have): any flavor of Linux; ETL tools (good to have): Informatica, Talend
- Deep understanding of cloud computing infrastructure and platforms
- Experience enabling DevOps automation for AWS or Azure with appropriate security and privacy considerations
- Expertise in developing ETL workflows comprising complex transformations such as slowly changing dimensions (SCD), deduplication, and aggregations
- Good expertise in databases, including NoSQL
Desirable Skills:
- Experience with cloud data warehouses such as Snowflake, Redshift, BigQuery, etc. will be a plus
- Experience with at least one major Hadoop platform (Cloudera, Hortonworks, MapR) will be a plus
Qualifications:
- Overall work experience of 7+ years, with a minimum of 3 to 6 years on AWS, Azure, or GCP projects
- BS degree in IT, MIS, or a business-related functional discipline
- Experience with or knowledge of Agile Software Development methodologies
Qualities:
- Ability to work individually on complex projects
- Able to guide and mentor junior developers, provide technical expertise, and contribute to team coordination and planning
- Adaptable, quick to learn new tools and frameworks, and resilient in the face of challenges and setbacks
- Outside-the-box thinking
- Strong verbal and written communication skills to convey technical concepts and collaborate effectively with team members, clients, and other stakeholders
Years of Experience: 7 to 9 years