Data Engineer (Python / GCP / PySpark / Terraform / Pipelines)

  • Location:

    London, England

  • Salary:

    £500 - £600 per day + negotiable

  • Contact:

    Martin Kettlewell

  • Published:

    3 months ago

  • Duration:

    6 months

Data Engineer (Python / GCP / PySpark / Terraform / Data Pipelines) required by my client, an eCommerce brand based in central London.

This is a 3-month contract paying up to £600/day, with a strong likelihood of extension. The interview and start will be remote, moving on site when normality resumes.

As this global eCommerce brand looks to move from AWS over to GCP, they need a Data Engineer with strong Python coding skills and experience in both AWS and GCP.

Day-to-day responsibilities and expected experience:
- Develop the backend data systems that support seamless integration across 20 countries, bringing in thousands of partner brands, using Spark and Python for processing and Parquet with Athena/Presto as the query engine
- Develop data pipelines
- Develop the architecture for a distributed crawler built on serverless AWS Lambda functions, together with SQS and SNS, executing concurrently across all countries and retailers
- Work extensively with the Elasticsearch-based search infrastructure to improve its indexing and query performance, scalability, and relevance of results
- Assist with the migration from on-premise infrastructure to GCP using BigQuery, Dataflow and Dataproc
- Introduce a data warehouse for the product database, organising the data into dimension and fact tables to enable traditional BI analysis of the pricing data
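To give a flavour of the last point, the dimension/fact modelling can be sketched in plain Python with hypothetical pricing data (the table names, products, and prices below are illustrative assumptions, not the client's schema):

```python
# Dimension table: one row per product, carrying descriptive attributes.
dim_product = {
    1: {"name": "Trainers", "brand": "BrandA", "category": "Footwear"},
    2: {"name": "Jacket", "brand": "BrandB", "category": "Outerwear"},
}

# Fact table: one row per price observation, with a foreign key
# (product_id) pointing into the dimension table.
fact_prices = [
    {"product_id": 1, "date": "2021-01-01", "price": 59.99},
    {"product_id": 1, "date": "2021-01-08", "price": 54.99},
    {"product_id": 2, "date": "2021-01-01", "price": 120.00},
]

def average_price_by_category(facts, dim):
    """Join facts to the dimension and average price per category,
    the kind of BI-style aggregation a star schema makes easy."""
    totals = {}
    for row in facts:
        category = dim[row["product_id"]]["category"]
        total, count = totals.get(category, (0.0, 0))
        totals[category] = (total + row["price"], count + 1)
    return {cat: total / count for cat, (total, count) in totals.items()}

print(average_price_by_category(fact_prices, dim_product))
```

In practice the same join and aggregation would run over Parquet files via Spark or BigQuery rather than in-memory dicts; the sketch only shows the table shapes involved.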

Tech stack
AWS, GCP cloud technologies

This is an opportunity to join one of the fastest-growing brands in the country and to work with some seriously talented people on genuinely interesting projects.

If you would like to find out more, please get in touch.