AWS Data Engineer

Job description

Location: Sydney (Hybrid)

Type: Contract (6 months with potential extension)

We are seeking an experienced Senior Data Engineer to design, develop, and optimise data pipelines and cloud-based platform components within a modern data ecosystem. This position blends hands-on technical delivery with leadership across data projects, delivering scalable, reusable data assets that support analytics, insights, and operational use cases.

As a senior member of the Data Platform Engineering team, you'll help shape engineering standards, mentor junior engineers, and work closely with cross-functional teams to ensure solutions are high quality, performant, and production-ready.

Key Responsibilities

  • Design, build, and maintain reliable and efficient data pipelines using AWS services (Glue, Lambda, S3) and Snowflake
  • Translate solution and modelling specifications into robust technical implementations
  • Participate in design discussions and peer reviews, and contribute to code quality improvements
  • Implement logging, monitoring, and error-handling features across all data pipelines
  • Apply CI/CD best practices and collaborate with DevOps to automate build, test, and deployment workflows
  • Develop solutions with scalability and reusability in mind
  • Mentor junior engineers and contribute to documentation and team knowledge sharing
  • Work with Data Ops teams to optimise performance and ensure deployment readiness
  • Partner with Solution Designers and Data Modellers to deliver fit-for-purpose data assets
  • Act as a technical liaison between engineering teams and broader project stakeholders

Collaboration

You'll work closely with:

  • Solution Designers and Data Modellers to ensure alignment between design and delivery
  • DevOps Engineers to implement CI/CD and deployment pipelines
  • Data Ops and Test Engineers to support monitoring, troubleshooting, and validation processes
  • Technical Leads and Managers to deliver scalable and maintainable data solutions

Qualifications & Experience

Essential:

  • Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or related field
  • Strong SQL and Python skills
  • Experience with data pipeline orchestration, transformation, and cloud-native tools (Snowflake, AWS Glue, Lambda, S3)
  • Familiarity with CI/CD and version control practices
  • 5+ years' experience in data engineering or related development roles
  • Proven ability to deliver end-to-end data solutions in cloud environments
  • Demonstrated leadership across data delivery workstreams

Desirable:

  • Snowflake and/or AWS certifications
  • Exposure to data quality frameworks or tools such as dbt or Great Expectations
  • Experience building modular and reusable data frameworks
  • Background in public sector, insurance, or regulated industries
  • Familiarity with real-time or machine learning data workflows
  • Experience working within Agile delivery environments