Senior Data Engineer

Date: 29-May-2022

Location: SLOUGH, GB

Company: Telefonica S.A.

It matters to us that Team O2 is as diverse as the communities we serve. 

We’re aware that we need more diversity in our senior leadership roles, so we welcome and encourage people from all backgrounds to apply. 

Whoever you are, O2 has a place for you. 
Come join us


Grade: PCGU / VM Level 5

Location: Slough / Flexible

Close date: 30/06/2022


The IT Data Architecture team plays a pivotal role in setting the strategic technical direction of Virgin Media O2 data platforms, services, and end-to-end delivery. Exploitation of data is key to both the business and technology strategies.


The Netpulse team is responsible for the technical design, test, build and support of O2’s Networks Big Data solution.


The team’s aim is to ensure Netpulse is designed and tested as a world-class Big Data platform, one that enables O2 to transform across all technologies and provide our customers with world-class services and user experience at the most effective cost to the business.


The VMO2 Netpulse cluster ingests more than 10TB of data per day, so it is crucial that this data is in a format that helps our business provide a better user experience, make better investment decisions, and deliver insight to our stakeholders.


Our Designer in the Netpulse team will use cutting-edge technologies to design and build robust enterprise solutions using governed design patterns and frameworks.

•    Designing and implementing highly performant data ingestion and transformation pipelines from multiple sources using Scala Spark, NiFi and HBase
•    Designing and developing pipelines for streaming and batch processes
•    Assuring end to end data availability and quality
•    Spark performance tuning and optimisation
•    Resolving problems in complex data pipelines with multiple technologies
•    Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
•    Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data
•    Helping to resolve technical problems
•    Working with other members of the project team to support delivery of additional project components
•    Evaluating the performance and applicability of multiple tools against requirements
•    Developing tangible and ongoing standardisation, simplification and efficiency of engineering processes, reviewing and revising continuous improvement opportunities
•    Ensuring that all data acquired is fully described/understood and communicated using appropriate tools
•    Communicating clearly, both orally and in writing, at all levels of the organisation


Essential Skills

•    Direct experience of building data pipelines using GCP-native technologies and Spark
•    Experience building data warehouse solutions using ETL / ELT frameworks
•    Experience with GCP-native technologies, Apache Kafka, Hive, HBase and NiFi for use with streaming and event-based data
•    Experience working with structured and unstructured data
•    Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines, utilising typical data quality functions such as standardisation, transformation, rationalisation, linking and matching
•    Experience of migrating data pipelines to BigQuery
•    Extensive skills in SQL, both at production grade and at analytical level, gained through intensive application in a commercial business environment
•    Ability to capture business requirements and translate them into low-level designs that can be actioned by a development team
Desirable Skills

•    Experience of building large scale data pipelines on at least one Cloud Platform (GCP preferred)
•    Experience of working with Telco data
•    Vendor and stakeholder management experience
•    GCP Big Data Architecture certification
•    Cloud migration experience 
•    Experienced in deploying data solutions and cloud infrastructure via CI/CD pipelines
•    Experienced in deploying Infrastructure as Code (Terraform, CloudFormation, etc.)
•    Knowledge of REST/Graph APIs and how they can be used in a data environment.
•    Knowledge of Docker/Kubernetes, and how these can be used to simplify deployments

We offer a great compensation package (depending on experience) for this position, plus plenty of extras to sweeten the deal, which could include bonuses, life assurance cover, healthcare and lots of flexible benefits.
Also, every employee has their personal development supported with a LinkedIn Learning account, plus other role-specific learning available through our award-winning digital learning platform, O2 Campus.
We also believe a great work-life balance is important, so we’re open to considering flexible working arrangements. If you’d like to know more, feel free to raise it.
Join us and we’ll encourage you to be bold every day. So take a deep breath, your career is about to go to exciting new places. 
If you have any questions about the role, please get in touch and we’ll be happy to help.