
Data Engineer

Date: 29-Sep-2020

Location: Uxbridge, GB

Company: Telefónica S.A.

giffgaff is the commercial brand of Telefónica UK Limited, a leading digital communications company owned by Telefónica S.A. We like to do things a little differently here at giffgaff.

 

We may be a small company, but we like to think big and make some radical waves in telco land. At the heart of it, we believe in simplicity. A better way to do mobile. We'd rather you stay with us because you want to, not because there's a nasty contract forcing you to. It's why we work our socks off every day to keep you, and guess what? It works. We're uSwitch Network of the Year 2019.

 

About the Team:

 

The core mission of the Data Engineering team is to deliver the data infrastructure and data processing pipelines that support giffgaff’s data products and business insights. Our team works horizontally, supporting all areas of the business from a data warehousing perspective, and has a strong technology focus.

 

Your Role:

 

Reporting to the Data Engineering Technical Lead, the Data Engineer is expected to have a deep understanding of data technologies and strong software engineering expertise, along with a keen interest in data analytics, machine learning and AI.

 

The role involves creating data solutions that process data at scale, in both batch and real-time pipelines, to support a wide range of data-driven projects and our transformation into an AI-ready organisation.

 

Responsibilities:

 

The key responsibilities of the Data Engineer are:

  • Implement workflows to ingest data into a Snowflake data warehouse from a variety of data sources
  • Implement data transformation pipelines in real-time and batch environments
  • Support all product teams in adopting our data engineering tech stack to generate new data streams
  • Collaborate with the Data Science and Business Intelligence teams to identify requirements and develop the data workflows needed to deliver against them

 

Skills & experience:

 

  • University degree in Computer Science, Software Engineering or a related subject, or equivalent experience
  • Experienced in Java 8+ and Python
  • Relational and non-relational databases; experience with Snowflake is a plus
  • Batch processing frameworks such as DBT, Flink and Apache Airflow
  • Message brokers / stream processing technologies (Kinesis, Kafka, Storm, Spark Streaming, Flink, etc.)
  • Familiarity with AWS, Docker, Kubernetes and Amazon EKS
  • Continuous Integration with Jenkins
  • Test-Driven Development and Extreme Programming (XP)

 

Additional Information:

 

This role involves close collaboration with data scientists, data analysts and product engineers.

 

Note: this role was previously advertised as a 12-month fixed-term contract (FTC) but is now a permanent role.

 

Grade: MPG4

 

Finally...

This is a chance to work for one of the most sought-after UK companies, highly regarded for its community model. In return for your outstanding efforts, you’ll be rewarded with a competitive salary and excellent benefits.

 

We believe that hard work should be supported and recognised. This position plays an important role across the business, allowing you to work cross-functionally, take on more responsibility and gain experience that will greatly benefit you in the future.