You will be part of a team of 8 Data Engineers driving operational data pipelines and data products.
In this role you will be responsible for minimizing manual work and driving the reuse of models and data.
- Work with big data technologies.
- Maintain legacy data infrastructure.
- Build infrastructure to support data pipelines using AWS and Python.
- Work with large complex data sets to meet cross functional business requirements.
- Acquisition, ingestion, transformation, curation, and productization of data assets.
- 5+ years of experience in a similar role.
- Extensive experience with AWS, Python, ETL, and Snowflake.
- Experience developing, optimising, and automating extract, transform, and load (ETL) routines to create coherent, high-quality, comprehensive data sets.
- Growing the capabilities of the data platform(s), solving new data problems and challenges.
If this sounds like you, hit Apply!