Data Engineer

Job description

As a Data Engineer, you will build and enhance data pipelines in the delivery of a wide range of enterprise data and analytics projects across multiple cloud platforms (e.g. AWS, Azure and GCP). You'll be expected to help drive the delivery of cutting-edge data solutions in technology areas such as Big Data & Analytics and Machine Learning.

The Role

This role works closely with other Practice teams, especially Solutions Architects and DevOps Engineers, to convert customer needs into functional solutions.

You will work closely with customer technical and business teams, Solution Architecture and Project Managers/Scrum Masters to bring solutions to life, transitioning projects to operational support and/or customer support teams.

As a Data Engineer, you will have the opportunity to mentor and collaborate with engineers and software developers internally and externally to raise and maintain data engineering standards.

Responsibilities:

  • Work with Solutions Architects to translate data-related customer requirements into a fit-for-purpose solution design
  • Work with stakeholders to translate a business problem into data-centric solutions that address functional, non-functional and commercial concerns (i.e. reliability, scalability, maintainability, cost to deliver, etc.)
  • Decompose business problems into a set of testable hypotheses, identifying the likely data assets that would support this evaluation
  • Interactively analyse and manipulate data using a variety of data analysis and data mining tools
  • Maintain and provide production support for corporate data warehousing applications
  • Make recommendations for the collection of new data or the refinement of existing data sources and storage
  • Create and maintain optimal data pipeline architectures
  • Assemble large, complex data sets that meet functional & non-functional business requirements
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using public cloud workloads (AWS/Azure)
  • Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability etc.
  • Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics
  • Assist clients and stakeholders with their data-related technical issues and support their data infrastructure needs
  • Produce re-usable components that can be deployed quickly to resolve common customer scenarios

Experience:

  • 3-5 years' experience in Data and Analytics projects
  • Experience supporting multiple customers across multiple platforms
  • Experience working in an Agile environment
  • Data visualisation implementation using tools such as Tableau, Power BI, HTML, JavaScript and D3.js
  • Good understanding of public cloud (Azure/AWS) data workloads including, but not limited to: Amazon Redshift, Amazon EMR, Azure Synapse, AWS Glue, Azure Databricks, Azure Data
  • Experience with machine learning and analytics workloads is preferred, including but not limited to Azure ML Studio and Amazon SageMaker
  • Demonstrated ownership of client projects, working from inception to delivery
  • Good understanding of DataOps and the process-oriented methodologies used to improve the quality and reduce the cycle time of data analytics

Skills, knowledge & abilities:

  • Data modelling and data analysis
  • Experience working in DevOps and Agile environments, including continuous integration
  • Strong software engineering and coding skills, ideally in a data intensive environment
  • Ability to communicate complex quantitative analysis in a clear, precise, and actionable manner
  • Ability to advocate strongly for technical positions while still appreciating alternative proposals
  • Well versed in the latest trends and techniques for building data pipelines for analytical/machine learning solutions
  • Well versed in data warehousing concepts and issues
  • A solid working knowledge and understanding of data modelling and data warehousing principles
  • Ingenuity and genuine ability in analysing and solving complex systems problems
  • Creative & thoughtful "hands-on" team member with a "can-do" approach
  • Communicates effectively with people across a variety of backgrounds and develops constructive working relationships
  • Skilled at diagrammatically representing processes, workflows, and ideas; able to prepare, present, and maintain technical documentation for customers and internal staff
  • Prioritises multiple workloads and competing priorities in a pressured environment
  • Willingness to learn and grow, with a proactive approach and initiative

Qualifications & certifications:

Essential

  • None are essential, mostly desirable. The proof is in the ability to deliver.

Desirable

  • For AWS specialists: AWS Certified Data Analytics - Specialty Certification
  • For AWS specialists: Azure certification
  • For Azure specialists: Azure Data Engineer Associate
  • Tertiary qualification in Data & AI

Working from home options available.

Please apply today or get in touch!