
Senior Data Engineer

London, UK

Who are we?

Smarkets is a betting exchange for sports and political trading that has handled over £29 billion in volume since 2010. Our company mission is to fix the betting industry with the best products and best prices. We are upending the sports betting industry by growing a transparent platform that offers the best value for bettors, with the fairest odds, best technology and a superior customer experience.

Smarkets is a Series B tech company that brings a professional, product-led approach to our mission. We embrace collaboration, trust, innovation and scientific rigour, and we celebrate ambitious goals and passionate energy. Our culture rewards merit and excellence, and we strive to provide a working environment where recognition, challenges, support, collaboration, interesting benefits and shared meals prepared by our chefs come together to let you unlock your potential, grow with us and become your best self.

Join our team and play a pivotal role in shaping the future of our betting and trading technology.

The Team

The Data Team is responsible for taking the wealth of data that Smarkets generates and using it to drive insights which improve the business. Since Smarkets produces a huge amount of data - including sports event data, payments information, order flow and user analytics - there are many opportunities for the team to add real business value.

The team’s responsibilities currently span three areas:

  • Data Engineering: development and maintenance of ETL pipelines, services and APIs, and data-related infrastructure like Redshift or BigQuery;
  • Data Science and Machine Learning: data exploration, ML model training and MLOps to extract new insights from data;
  • Analytics and Reporting: creation of data models and dashboards as well as automation of reporting pipelines for different teams, stakeholders and third parties.

In a typical week, a data engineer in the Data team would:

  • Add a new Python ETL pipeline that segments users interested in specific sports based on their behaviour, so that marketing communications can be streamlined and tailored to those users;
  • Add a new endpoint to a Flask API, write unit tests, and deploy the new version of the API to our production Kubernetes cluster (a minimal sketch of such an endpoint follows the technology stack below);
  • Train and evaluate an ML model to identify certain user patterns and provide it as a service to other engineering teams behind a Flask API.

Our current technology stack primarily includes Linux, Docker, Kubernetes, Jenkins, Kafka, Python, Flask, Postgres, AWS Redshift, dbt, Google BigQuery, Prometheus, Grafana, Elasticsearch, Kibana, Sisense.
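
For illustration only, here is a minimal sketch of the kind of Flask endpoint described in the list above. The route and the in-memory store are hypothetical assumptions, not part of Smarkets' actual API; in practice the data would come from Postgres or the warehouse.

```python
# Illustrative sketch only: a hypothetical Flask endpoint of the kind a data
# engineer here might add. The route and the in-memory store are assumptions;
# a real service would query Postgres or the warehouse instead.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical stand-in for a Postgres-backed lookup of user sport segments.
USER_SEGMENTS = {
    42: ["football", "tennis"],
}

@app.route("/users/<int:user_id>/segments", methods=["GET"])
def get_user_segments(user_id):
    """Return the sport segments a user has been assigned to."""
    segments = USER_SEGMENTS.get(user_id)
    if segments is None:
        return jsonify({"error": "user not found"}), 404
    return jsonify({"user_id": user_id, "segments": segments})

if __name__ == "__main__":
    app.run()
```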

Your responsibilities

As a member of the Data team, your responsibilities will include contributing to:

  • Developing and maintaining our data ETL pipelines, some of which are real-time. The pipelines are fundamental to helping teams and stakeholders understand and drive business direction. Data components can also be user-facing, e.g. sending notifications to users;
  • Ensuring our data lake is kept in a healthy state, particularly our data warehouses: Redshift and BigQuery;
  • Developing and maintaining Flask services and Postgres databases within the Data team to provide access to data or manage certain business entities relevant to Data;
  • Assisting the different teams in the company with reporting, especially when it comes to automated reporting pipelines;
  • Doing data exploration and training and deploying ML models used to perform different kinds of user segmentation, detect operational anomalies or estimate important business quantities; in particular, maintaining and improving our existing recommender service, which suggests new sports competitions to users (a minimal segmentation sketch follows this list).
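
As a purely illustrative example of the user segmentation work mentioned in the last bullet, the sketch below clusters users with scikit-learn. The features and cluster count are assumptions chosen for illustration, not the team's actual model.

```python
# Illustrative sketch only: clustering users into segments with scikit-learn.
# The features and number of clusters are assumptions, not the real model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical per-user features: [bets per week, average stake, days since last bet]
user_features = np.array([
    [12.0, 5.0, 1.0],
    [1.0, 50.0, 30.0],
    [8.0, 10.0, 2.0],
    [0.5, 100.0, 60.0],
])

# Scale the features so no single dimension dominates the distance metric.
scaled = StandardScaler().fit_transform(user_features)

# Assign each user to one of two illustrative segments.
model = KMeans(n_clusters=2, n_init=10, random_state=0)
segments = model.fit_predict(scaled)
print(segments)  # e.g. [0 1 0 1]
```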

You will work very closely with the Data team lead and the other team members, who will assist you whenever needed, making your integration into the company as smooth as possible.

The Data team works in an organised way, using Agile methodologies, tools such as Jira, and regular stand-ups. You will find an environment where you have a clear engineering direction, can focus on your work and can hone your skills as a data engineer through exciting projects. You will always be able to count on the support of many engineers across the company.

Role requirements

  • 4+ years of experience
  • You understand how to develop ETL pipelines using Python frameworks such as Luigi or Airflow (a minimal sketch follows this list);
  • You have experience developing Python-based REST APIs / services and integrating them with databases (e.g. Postgres);
  • You are familiar with the key tools of the Python data science stack, e.g. pandas, NumPy and scikit-learn;
  • You have some experience training and deploying ML models;
  • You enjoy writing elegant, well-tested and maintainable code;
  • You are a team player who enjoys contributing to the success of the team in a proactive and friendly way;
  • You have a bachelor’s degree in Computer Science, Mathematics or an equivalent field, or relevant experience.
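
For illustration only, here is a minimal sketch of the kind of ETL pipeline mentioned in the first requirement, written against Airflow (one of the frameworks named above). The DAG id, schedule and task logic are hypothetical assumptions, not an actual Smarkets pipeline.

```python
# Illustrative sketch only: a tiny Airflow DAG with an extract and a load step.
# The DAG id, schedule and task bodies are hypothetical assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # In a real pipeline this would read order flow or user events
    # from Kafka or Postgres.
    return [{"user_id": 42, "sport": "tennis"}]


def load(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    # In a real pipeline this would write to Redshift or BigQuery.
    print(f"loading {len(rows)} rows")


with DAG(
    dag_id="user_segmentation_example",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task
```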

Our Values

  • Push to win
  • Make others better
  • Give a shit
  • Be a pro
  • Bring the energy

Our values are at the heart of everything that we do. We believe these are the fundamentals to ensure we are delivering what’s expected of us in the best way possible for ourselves and for those around us.

Benefits

We offer a competitive salary package and benefits, along with a dynamic and collaborative work environment. Your work with us will make an impact and your voice will be heard.
We are a diverse team with a strong work ethic and plenty of hunger to win. We have designed our benefits offering around Health, Wealth, Lifestyle and Development.

These include:

  • Stock options which vest over 4 years
  • Pension scheme - an impressive pension scheme via Aviva; we will contribute 6% if you do the same
  • Health insurance
  • Fresh fruit and snacks provided in the office every day! (tea, coffee and soft drinks also included)
  • We’re a member of a cycle-to-work scheme
  • We want to continue to invest in all our employees, so we provide a £1,000 yearly education budget that can be used on courses, conferences, books or training
  • 25 days paid holiday + bank holidays to enjoy - you have the choice to carry over 5 days to the next year!
  • Flexible working - we take a hybrid approach to working, with 2 days a week working from home.
  • 20 days a year of global working - we provide the ability to work from anywhere in the world for up to 20 days a year.
  • We will provide you with lunch in the office every day - you don’t need to worry about feeding yourself, as we have it sorted with top-quality food served by our in-house Chef Alex!

What happens next

We aim to have a simple and speedy hiring process and we want to make sure that we are right for you as much as the other way around.

  • CV application review - We will review it as quickly as possible
  • HackerRank Exercise - At-home test to show off your skills
  • Let’s chat - Quick chat with our team about your experience and the role
  • Technical Interview - Live Coding + System Design Interview
  • In-Office Interview - Experience and Mindset Chat + Lunch with other colleagues in the team

Apply for this job
