Data / Records Management, IT & Telecomms

We are partnering with a global tech consultancy to find a Data Engineer for an initial 3-month contract.

If you enjoy designing scalable data pipelines and building modern, high-performance data platforms, this is a great opportunity for you!

– Must be Brisbane-based, as this is a hybrid work environment.
– 12-week contract with the possibility of extension.
– You will demonstrate your expertise and gain exposure to innovative tech within a short-term, high-impact project.

What You’ll Be Doing:

  • Build scalable batch and streaming pipelines with Python and PySpark
  • Design Delta Lake architectures on Databricks
  • Orchestrate workflows and jobs in Databricks
  • Tune pipeline performance and manage code using Databricks libraries
  • Manage AWS S3 data lakes for secure data access
  • Deploy infrastructure using Terraform or CloudFormation
  • Automate AWS services using Boto3
  • Collaborate across teams to ensure data reliability
  • Maintain data quality and observability standards

What We’re Looking For:

  • Proven experience in data engineering with Databricks and AWS.
  • Strong programming skills in Python and PySpark.
  • Hands-on experience with Delta Lake and structured streaming.
  • Deep understanding of data lake architecture and ETL pipeline design.
  • Experience with Terraform, CloudFormation, or similar IaC tools.
  • Strong problem-solving skills and ability to work autonomously in a fast-paced environment.
  • Excellent communication and collaboration abilities.
  • Experience working in a multi-cloud or multi-account AWS environment will be highly regarded.
  • Familiarity with data governance, data cataloguing, and security best practices is a bonus.

Does this sound like you? Please apply here!