
Data & AI Warsaw Tech Summit 2026: Data Platform Engineer – Build the Backbone of AI

Poland

Data Platform Engineer – Build the Backbone of AI

Capco at Data & AI Warsaw Tech Summit 2026

About Capco Poland

At Capco Poland, we’re not just another consultancy; we’re the spark behind digital transformation in the financial world.

As a global technology and management consultancy focused on financial services, we partner with leading banks, fintechs, and financial institutions to design and deliver next-generation data platforms, AI solutions, and digital ecosystems.

From data strategy and modern data platforms to AI-driven decision systems and GenAI innovation, our teams help clients unlock the true value of their data.

Our secret?

A culture that’s fast, flexible, and fiercely entrepreneurial. We move quickly, think creatively, and empower our people to push the boundaries of what technology can achieve.

At Capco Poland, we are proud to be:

Technology partners for leading banks, payments providers, and financial institutions
Builders of modern data platforms and AI-powered systems
Champions of innovation across cloud, data engineering, machine learning, and GenAI
A community of engineers, architects, and consultants passionate about solving complex problems


Meet Capco at the Data & AI Warsaw Tech Summit (21.04 & 22.04.2026) 🚀

At this year’s Data & AI Warsaw Tech Summit, Capco will share how financial institutions can move from experimentation to production-grade AI and scalable data ecosystems.

Our experts will explore how organizations can:

• Build AI-native architectures on modern cloud platforms
• Scale machine learning and generative AI solutions across enterprise environments
• Transform fragmented data into high-value data products
• Embed AI into real business workflows and decision-making systems

Capco Speakers at Data & AI Warsaw Tech Summit 🚀

Andrzej Worona, Head of AI and Data @ Capco Poland, and Laura Żusin-Kaczmarek, Data Practice Lead @ Capco Poland

Topic: From Data to Meaning: Educating AI in Banking with Ontologies: Lessons from FIBO and Conversational Banking

Time: 11:50–12:10 CET

Intro:

Many AI solutions still fall short when it comes to understanding and reasoning about complex financial concepts. The real challenge lies in how financial knowledge is represented and shared with machines: why does AI still misunderstand basic banking terms despite having access to vast amounts of data, and how can it truly understand financial concepts?

Using the Financial Industry Business Ontology (FIBO) as an example of structured domain knowledge, we will discuss how formal, machine-readable definitions can provide the contextual foundation AI needs. By analysing selected conversational banking scenarios and example solutions, we will invite participants to reflect together on what the right semantic layer for AI in banking should look like.

Join us to discover why the next leap in AI for banking isn’t just about more data or better models, but about building a structured understanding of financial meaning.
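As a loose illustration of the idea (entirely hypothetical: real FIBO is a large OWL ontology, not a Python dict), even a tiny machine-readable class hierarchy lets software answer "is X a kind of Y?" questions that plain keyword matching cannot:

```python
# Toy, hand-made fragment inspired by FIBO-style class hierarchies.
# Each entry maps a concept to its direct parent class.
ONTOLOGY = {
    "SavingsAccount": "DepositAccount",
    "CheckingAccount": "DepositAccount",
    "DepositAccount": "FinancialAccount",
    "LoanAccount": "FinancialAccount",
}

def is_a(concept, ancestor):
    """Follow subclass links upward to answer 'is X a kind of Y?'."""
    while concept is not None:
        if concept == ancestor:
            return True
        concept = ONTOLOGY.get(concept)  # None once we reach the root
    return False

print(is_a("SavingsAccount", "FinancialAccount"))  # True
print(is_a("SavingsAccount", "LoanAccount"))       # False
```

A formal ontology generalises this sketch with richer relations (properties, restrictions, equivalences), which is exactly the contextual grounding the talk argues AI in banking needs.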



We’re Looking for Data Engineers

Role Overview

We are looking for a Data Engineer to join our Data & Analytics team. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and architectures. You will work closely with data analysts, data scientists, and business stakeholders to ensure reliable and high-quality data is available for decision-making.

Key Responsibilities

  • Design, develop, and maintain ETL/ELT data pipelines

  • Integrate data from multiple sources (APIs, databases, external systems)

  • Build and optimize data warehouses and data lakes

  • Ensure data quality, consistency, and availability

  • Monitor and improve performance of data processing systems

  • Collaborate with data scientists and analysts to deliver datasets

  • Create and maintain technical documentation

  • Implement and manage cloud-based data solutions (AWS, Azure, GCP)
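As a purely illustrative sketch of the extract-transform-load cycle described above (toy data, hypothetical field names, and SQLite standing in for a warehouse), the pipeline boils down to three small, testable stages:

```python
import csv
import io
import sqlite3

# Hypothetical raw extract: a CSV feed from an upstream source system.
RAW_CSV = """account_id,balance,currency
A-001, 1250.50 ,PLN
A-002,99.99,EUR
A-003,,PLN
"""

def extract(text):
    """Extract: parse the raw CSV feed into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, cast types, drop incomplete records."""
    clean = []
    for row in rows:
        balance = row["balance"].strip()
        if not balance:
            continue  # basic data-quality rule: reject rows with no balance
        clean.append((row["account_id"].strip(), float(balance),
                      row["currency"].strip()))
    return clean

def load(rows, conn):
    """Load: write validated rows into a warehouse-style table."""
    conn.execute("CREATE TABLE IF NOT EXISTS balances "
                 "(account_id TEXT, balance REAL, currency TEXT)")
    conn.executemany("INSERT INTO balances VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM balances").fetchone()[0]
print(count)  # 2 — only the rows that pass the quality check are loaded
```

In production the same shape is typically expressed with an orchestrator such as Airflow or dbt, with each stage monitored for data quality and performance.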

Requirements

  • Proven experience as a Data Engineer or similar role (3+ years)

  • Strong SQL skills

  • Proficiency in Python or Scala

  • Experience with ETL tools (e.g., Airflow, dbt, Informatica, Talend)

  • Experience with relational and NoSQL databases

  • Familiarity with cloud platforms (AWS, Azure, or GCP)

  • Understanding of data warehousing concepts and data modeling

  • Experience working with large-scale data (e.g., Spark, Hadoop)

  • Strong problem-solving and communication skills

Nice to Have

  • Experience with BI tools (e.g., Power BI, Tableau)

  • Knowledge of DataOps and CI/CD practices

  • Experience working in Agile environments

  • Familiarity with Docker and Kubernetes

Online Recruitment Process

  • Screening call with the Recruiter
  • Hiring Manager Technical Interview
  • Feedback
  • Offer

We offer a flexible collaboration model based on a B2B contract with the opportunity to work on diverse projects.

Apply for this job

Required fields are marked with an asterisk (*).

Phone
Resume/CV* (accepted file types: pdf, doc, docx, txt, rtf)
What is your current location?*

Do you require reasonable accommodations or adjustments?

If you answered yes to the previous question, please provide additional details.


Capco Job Candidate Privacy Notice Acknowledgement 

I acknowledge that the information I provide will be processed and used for the purposes described in Capco’s Job Candidate Privacy Notice.