Job title: Data Engineer
Job type: Permanent
Emp type: Full-time
Salary type: Annual
Salary: negotiable
Location: London
Job ID: 41736

Job Description

More about the role:

  • Use your experience in developing data pipelines to deliver data capability.
  • High-quality, appropriately governed data is essential to the outcomes your role will support, so you will apply your knowledge and skills to create quality data assets and products.
  • Focus on developing data solutions, whether onboarding data or transforming it into different normalisation patterns for consumption by business users.

Key responsibilities:

  • Technical Proficiency: Possess the technical aptitude to provide data engineering capabilities.
  • Teamwork: Be at ease in a self-organising setting where you must continuously decide how to approach data problems with minimal guidance.
  • Product Vision: Understand the data vision established by the Product Owners and Technical Leads, and ensure that the data functionality you design aligns with the broader goal.
  • Architecture Principles: Recognise the fundamentals of data architecture and how they relate to your own work.
  • Standards and Best Practices: Ensure that, wherever feasible, your development work adheres to standards and best practices, such as data governance guidelines and data cataloguing.
  • Coding: Write highly proficient code that is suitably organised and optimised.
  • Data Modelling: Follow the recommended data modelling techniques, and use your thorough understanding of the techniques employed in the data platform to ensure that data capability complies with established modelling methods.
  • Vendor Products: Recommend which features and approaches work best in given scenarios, based on your in-depth knowledge of the platform vendors' products.
  • Agile Best Practices: Champion agile best practices, tools, and processes across the product teams while adhering to the principles of the Scrum framework.
  • Cross-functional Collaboration: Work with stakeholders, development teams, and product owners to ensure everyone understands the product's vision, priorities, and needs.
  • Continuous Improvement: Actively seek opportunities for continuous improvement at the team and organisational levels through retrospectives, knowledge sharing, and agile coaching.
  • Coaching: Mentor other Data Engineers and business users who want to leverage data capabilities, arranging regular training sessions to advance colleagues' skills.
  • Training and Development: Take part in courses to advance your own knowledge and skills.

Key requirements:

  • A solid understanding of cloud data tooling, whether ELT/ETL tooling or enterprise data platforms, is essential for developing data functionality on the data platform.
  • Strong SQL skills are essential for building data pipelines and data stores.
  • Python knowledge is advantageous.
  • A strong communicator, able to convey complex concepts effectively and influence stakeholders at all levels.
  • Strong analytical and problem-solving skills

Tech stack includes:

  • Infrastructure on AWS, with S3, EC2, Lambda, and Airflow as the main components
  • Terraform for infrastructure as code
  • Snowflake as the data platform
  • Snowpark and Sagemaker for data science
  • GitHub and GitHub Actions for continuous integration and deployment
  • Python and SQL as the data languages
  • Collibra for Data Cataloguing
  • ER Studio for Data Modelling
  • Matillion and dbt for ELT/ETL
  • Tableau for data visualisation and dashboarding
