
Senior Data Engineer - Azure / Python

Job description


This disruptive IoT/data SaaS product unlocks the true potential of smart buildings and infrastructure. They are writing a new chapter in human history, with unprecedented resource optimisation and management empowered by data.

For the second year in a row (2020 & 2021), they have been ranked in LinkedIn's Australian "Top 25 Start-ups" list. You will be joining a team of performance-driven individuals, backed by the most advanced technology the built world has ever seen.

Summary of role:

As a Senior Data Engineer, you will work on end-to-end, mostly greenfield projects using a modern cloud tech/data stack within the emerging Digital Twins technology space. This is a dynamic environment with plenty of interesting data engineering challenges to solve, and a great opportunity to upskill in Azure data services and Snowflake, a leading cloud data warehouse platform.

Role & Responsibilities:

    • Drive the company-wide rollout of our Snowflake data warehouse and take ownership of the end-to-end process from requirements gathering to implementation.
    • Design, build and maintain data pipelines ensuring data quality, efficient processing, and timely delivery of accurate and trusted data.
    • Ensure performance, security, and availability of the data warehouse.
    • Establish ongoing end-to-end monitoring for the data pipelines.
    • Help set up and maintain CI/CD pipelines to support the data warehouse.
    • Interface with the CyberSec team to ensure consistent application of standard security policies.

Skills & Experience:

    • 3+ years of recent commercial experience building and optimising data pipelines, architectures, and datasets in Azure, and 6+ years in total across data engineering, data warehousing, or software engineering roles.
    • Demonstrable experience implementing and optimizing both batch and streaming data pipelines.
    • Understanding of modern data warehouse/data lake modelling.
    • Solid commercial experience with Azure services, including Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Functions, Azure Key Vault, etc.
    • Coding proficiency in at least one modern programming language (preferably Python; C# is also accepted).
    • Experience in developing automated build and deployment pipelines using YAML, ARM templates and PowerShell.
    • Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
    • Expertise in performance tuning and troubleshooting.
    • An active learner who is passionate about data and new technologies, and comfortable recommending new and better ways of doing things to the team.
    • Comfortable collaborating with data analysts, data scientists, and software engineers, and able to explain data concepts to product managers and other business stakeholders.
    • Attention to detail and strong problem-solving skills.
    • Tertiary qualification in Computer Science or a similar relevant field.

Nice To Have:

    • Practical experience with the Snowflake cloud data warehouse is highly regarded.
    • Experience with BI reporting tools, e.g. Power BI, Sigma, Qlik, or Tableau.
    • Working knowledge of Apache Spark and the Databricks platform.
    • Experience with Git or a similar version control/source code management tool.
    • Azure certification.

This is an opportunity to work for the next Atlassian, a global organisation driven out of Australia.