Senior Snowflake Data Engineer

Job description



Our client is a disruptive IoT/Data SaaS company that unlocks the true potential of smart buildings and infrastructure. They are writing a new chapter in human history, with unprecedented resource optimisation and management empowered by data.

Summary of role:

As a Senior Data Engineer, you will work on end-to-end, mostly greenfield projects using a modern cloud tech/data stack within the emerging Digital Twins technology space. This is a dynamic environment with plenty of interesting data engineering challenges to solve, and a great opportunity to upskill on Azure data services and Snowflake, a leading cloud data warehouse platform.

Role & Responsibilities:

    • Drive the company-wide rollout of our Snowflake data warehouse and take ownership of the end-to-end process from requirements gathering to implementation.
    • Design, build and maintain data pipelines ensuring data quality, efficient processing, and timely delivery of accurate and trusted data.
    • Ensure performance, security, and availability of the data warehouse.
    • Establish ongoing end-to-end monitoring for the data pipelines.
    • Help set up and maintain CI/CD pipelines to support the data warehouse.
    • Interface with the CyberSec team to ensure consistent application of standard security policies.

Skills & Experience:

    • 2+ years of recent commercial experience with Snowflake and a total of 6+ years in a data warehousing or data engineering role.
    • Demonstrable experience designing and implementing modern data warehouse/data lake solutions with an understanding of best practices.
    • Hands-on development experience with the Snowflake data platform, including Snowpipe, tasks, stored procedures, streams, resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, cloning, time travel, and data sharing.
    • Advanced proficiency in writing complex SQL statements and manipulating large structured and semi-structured datasets.
    • Coding proficiency, preferably in Python, though C# is also acceptable.
    • Expertise in performance tuning and troubleshooting.
    • An active learner, passionate about data and new technologies, and comfortable recommending new and better ways of doing things to the team.
    • Very comfortable collaborating with data analysts, data scientists, and software engineers, as well as talking to product managers and other business stakeholders and explaining data concepts.
    • Attention to detail and strong problem-solving skills.
    • Tertiary qualification in Computer Science or similar relevant field.

Bonus/Nice to have:

    • Experience with public cloud platforms, preferably Azure, is highly regarded.
    • Experience with BI reporting tools, e.g. Power BI, Sigma, Qlik, or Tableau.
    • Working knowledge of Apache Spark and Databricks platform is a plus.
    • Experience with Git or a similar version control/source code management tool is desired.
    • Exposure to CI/CD concepts.
    • Snowflake certification.

A 100% remote working environment, or on-site in a state-of-the-art mid-CBD building.