
Senior Data Engineer (Azure)

Bulgaria; Czechia; Moldova; Romania


Hello, let’s meet!

Who We Are

While Xebia is a global tech company, our journey in CEE started with two Polish companies – PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we’re a team of 1,000+ experts delivering top-notch work across cloud, data, and software. And we’re just getting started.

What We Do

We work on projects that matter – and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data and AI solutions, and cutting-edge applications to shape the future of tech. Our clients include McLaren, Aviva, Deloitte, Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, InPost, and many, many more.

We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we’re proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland!

Beyond Projects

What makes Xebia special? Our community. We support tech communities, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets — for both tech and soft skills. It’s not just a job. It’s a place to grow.

What sets us apart? 

Our mindset. Our vibe. Our people. And while that’s hard to capture in text – come visit us and see for yourself.

You will be:

  • designing and implementing data ingestion, transformation and storage layers within an Azure lakehouse architecture,
  • building and maintaining data pipelines using Microsoft Fabric (Lakehouse, Data Engineering, Pipelines),
  • translating business and analytical requirements into scalable data models and pipelines,
  • implementing data quality, governance and security standards (GDPR, ISO‑aligned practices),
  • supporting analytical and reporting use cases, including Power BI consumption,
  • collaborating closely with Data Architects, Business Analysts and Delivery Managers in a project‑based engagement,
  • contributing to platform evolution decisions (architecture, cost model, scalability, data sharing),
  • ensuring production‑ready, maintainable and well‑documented data solutions.

Your profile:

  • strong experience as a Data Engineer in cloud environments,
  • solid knowledge of Azure data services,
  • experience with lakehouse architecture and modern data platforms,
  • proficiency in SQL and Python,
  • experience building production‑grade data pipelines and data models,
  • good understanding of data governance, security and GDPR requirements,
  • ability to work independently and take ownership of delivered components,
  • strong communication skills and experience working with business and technical stakeholders,
  • hands‑on experience with Microsoft Fabric or Databricks,
  • practical experience using AI-powered assistants (e.g. Claude Code, GitHub Copilot, Cursor) to improve productivity, quality, or decision-making in software delivery.

Work from the European Union region and a work permit are required.

Nice to have:

  • familiarity with enterprise data governance frameworks,
  • experience preparing data platforms for AI / ML use cases,
  • knowledge of data sharing patterns and external data consumers,
  • experience applying GenAI in a more structured way within the SDLC, including defined workflows, prompt patterns, or tool integrations embedded into daily work,
  • interest in and familiarity with emerging AI-driven practices (e.g. agent-based workflows, automation patterns, AI-augmented development), with a willingness to explore and experiment beyond standard approaches.


Recruitment Process:

CV review – HR call – Interview – Client Interview – Decision

Rate your proficiency in each of the following on a scale of 1 to 5:

  • Azure
  • Microsoft Fabric
  • Databricks
  • SQL
  • Python

Legend:

1 - Beginner (basic knowledge, limited practical experience)

2 - Junior (some practical experience, still learning)

3 - Intermediate (comfortable using it independently in projects)

4 - Advanced (deep understanding, can optimize and solve problems)

5 - Expert (can mentor others, design complex solutions)
