Data Engineer

Tartu, Estonia

Boku Inc. (BOKU.L) is the leading global provider of local mobile-first payments solutions. Global brands including Amazon, DAZN, Meta, Google, Microsoft, Netflix, Sony, Spotify, and Tencent rely on Boku to reach millions of new paying consumers who do not use credit cards with our purpose-built payment network of more than 300 local payment methods across 70+ countries. Every year, Boku processes over $10 billion in value for our customers. Incorporated in 2008, Boku is headquartered in London and San Francisco and has employees in over 39 countries around the world, including Brazil, China, Estonia, Germany, Ireland, Japan, Singapore, and the UAE. Boku is a truly global company that takes pride in its diversity and thriving equal opportunity workplace.

We are looking for a Data Engineer to join our Data Platform team in Estonia, which develops and maintains Boku’s data infrastructure to ingest, store, transform, and deliver data. Boku’s Data Platform enables our internal stakeholders to build data products by abstracting away much of the data engineering complexity, allowing them to focus on core product business logic.


Key Responsibilities:

  • Deployment and Maintenance: Build, test, review, and maintain scalable data pipelines using Dagster, Amazon MSK, Redshift, and dbt
  • Solution Design: Design and document solutions with maintainability, scalability, and user experience in mind
  • Task Execution: Independently design and deliver small tasks; deliver medium-sized tasks with assistance from senior engineers
  • Best Practices: Consistently follow engineering best practices and propose solutions/testing strategies
  • Code Review: Ask clarifying questions about design decisions while reviewing others' code
  • Testing and Documentation: Test and document work performed as necessary


Key Skills and Competencies:

  • Technical Skills:
    • 3-5 years of technical experience, including a minimum of 2 years as a Data Engineer, with a strong background in software development (ideally in Python)
    • Proficiency with ETL processes, relational databases, and data warehouses
    • Experience with orchestrating data pipelines and building data testing pipelines
    • Exposure to Kafka, with development experience as a plus
    • Experience with AWS data technologies is a plus
    • Exposure to data observability or metadata tools (e.g. DataHub) is a plus
  • Personal Attributes:
    • Team Player: Work together with the team to solve problems and get work done.
    • Entrepreneurial: Self-starter who is curious and willing to help solve problems both inside and outside of the team.
    • Ownership: Take ownership to deliver work successfully.
    • Business-minded: Understand how the business works and how technology fits in the context of the business.
    • Collaborative & Communicative: Enjoy working in an agile work environment.
    • Big-picture Oriented: Focus on overall goals and outcomes.
    • Creative: Bring innovative ideas and solutions.

Apply for this job
