Senior Software Engineer, Datacraft
- We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses.
- We’re making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise — available on demand, at every touchpoint in their journey.
- We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do.
Become a Senior Data Engineer for Bloomreach!
Join our newly formed Datacraft team — the team building the next-generation data platform that powers our internal DWH, analytics dashboards, and the Loomi Analytics Agent for Bloomreach Engagement. Your engineering work will directly impact how hundreds of enterprise customers access, understand, and activate their Bloomreach data, both internally and via data sharing in Snowflake, BigQuery, Databricks, and beyond. Your starting salary will be from 4 250 € per month, along with stock options and other benefits. Working in one of our Central European offices (Bratislava, Prague, Brno) or from home on a full-time basis, you'll become a core part of the Engineering team.
What challenge awaits you?
You will join the Datacraft team as one of its founding engineers. Datacraft is a new team in the Engagement pillar, established to tackle three interconnected domains:
- Data Warehouses (~60% of team domain) — making Bloomreach data first-class in customer DWHs (Snowflake, BigQuery, Databricks). The strategic goal for 2026–27 is to use DWHs to exponentially accelerate data adoption — both for customers who want their data outside Bloomreach and for Bloomreach itself to build analytics faster.
- Loomi Analytics Agent (~20%) — evolving Loomi Analytics from a constrained report builder into an agentic analytics assistant that can explore data across systems, explain insights, and eventually act on them. You will help build the data backbone the agent operates on.
- Dashboards & Analytics Stack (~20%) — moving reporting from the proprietary stack onto DWH-backed, modern analytics stacks (semantic layers, headless BI tools), dramatically speeding up how fast we can ship and iterate on dashboards.
Datacraft is an AI-first team. We believe code is a commodity and expect every engineer to fluently use coding agents (e.g., Cursor, Claude Code, Copilot, Gemini CLI) as a core part of their daily workflow. The ability to leverage AI tooling to accelerate development, prototyping, and problem-solving is not optional — it's foundational. As a P3 (Senior) engineer at Bloomreach, you are an independent professional — expert in at least one component, able to decompose objectives into tasks, and lead small projects end-to-end with minimal day-to-day guidance.
Your job will be to:
a. Design and build the DWH data platform (~60%)
- Design and build robust data pipelines that move and transform Engagement data (events, profiles, campaigns, aggregates) into DWHs (BigQuery, Snowflake, Databricks).
- Implement and tune batch and streaming ingestion patterns (e.g., Kafka → GCS → Iceberg → DWH) with attention to scalability, cost, and reliability, following medallion architecture principles (see the sketch after this list).
- Contribute to data mutation handling in ETL pipelines, ensuring DWH data correctly reflects Bloomreach's ID resolution and consent semantics.
- Own and evolve data models that make Bloomreach data easy to use.
- Build and maintain orchestration and scheduling (Airflow / Cloud Composer) so complex workflows run predictably and are observable.
- Contribute to our monitoring, alerting, and reliability improvements so we catch issues (missing loads, quota limits, data drift) before customers do.
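For a flavor of this work, here is a minimal, illustrative Airflow sketch of a medallion-style flow (Kafka → GCS → Iceberg → DWH). Every name in it (schedule, bucket, tables, task logic) is a hypothetical placeholder, not the team's actual pipeline:

```python
# A minimal sketch only: an Airflow DAG outlining the bronze -> silver -> gold
# (medallion) flow described above. Every identifier here is hypothetical.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@hourly",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    catchup=False,
    tags=["datacraft-example"],
)
def engagement_events_medallion():
    @task
    def land_kafka_batch_to_gcs() -> str:
        # Bronze: land a batch of raw Kafka events in GCS and return the path.
        return "gs://example-bronze/events/"  # hypothetical bucket

    @task
    def merge_into_iceberg(landed_path: str) -> str:
        # Silver: deduplicate, apply ID-resolution and consent rules, and
        # MERGE the landed files into an Iceberg table (e.g., via a Spark
        # job on Dataproc).
        return "example_lake.silver.events"  # hypothetical table

    @task
    def publish_gold_to_dwh(silver_table: str) -> None:
        # Gold: build analytics-ready models and expose them in the DWH
        # (BigQuery, Snowflake, Databricks).
        ...

    publish_gold_to_dwh(merge_into_iceberg(land_kafka_batch_to_gcs()))


engagement_events_medallion()
```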
b. Shape the data layer for the Loomi Analytics Agent (~20%)
- Help implement evaluation harnesses and analytics skills (trend analysis, driver analysis, experiment analysis) in the Loomi analytics assistant.
- Fine-tune Loomi system prompts, implement tracing, and enable AI agent federation.
- Work with Product and applied AI engineers to provide clean, well-modeled data interfaces and MCP tools for the Loomi Analytics Agent platform (a minimal tool sketch follows this list).
- Contribute to agentic workflows where Loomi collaborates with DWH analytics/AI (e.g., asking DWH agents for segment performance via MCP, combining results across systems).
- Ensure Loomi's data access patterns are reliable, explainable, and debuggable, minimizing sources of hallucination or data inconsistencies.
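To make the MCP angle concrete, here is an illustrative sketch of how one governed DWH query could be exposed to the agent as a tool, using the FastMCP helper from the open-source MCP Python SDK. The server, tool, table, and column names are all invented for the example:

```python
# Illustrative only: exposing a fixed, parameterized DWH query to an agent as
# an MCP tool. Keeping the SQL fixed (no free-form generation on this path)
# helps keep the agent's data access explainable and debuggable.
from google.cloud import bigquery
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("dwh-analytics")  # hypothetical server name
bq = bigquery.Client()


@mcp.tool()
def segment_performance(segment_id: str, days: int = 28) -> list[dict]:
    """Return daily conversions for a segment over a recent window."""
    sql = """
        SELECT event_date, SUM(conversions) AS conversions
        FROM `example_project.gold.segment_daily`  -- hypothetical table
        WHERE segment_id = @segment_id
          AND event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL @days DAY)
        GROUP BY event_date
        ORDER BY event_date
    """
    job = bq.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("segment_id", "STRING", segment_id),
                bigquery.ScalarQueryParameter("days", "INT64", days),
            ]
        ),
    )
    return [dict(row) for row in job.result()]


if __name__ == "__main__":
    mcp.run()  # serve over stdio to agent clients
```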
c. Co-build the dashboards and analytics stack (~20%)
- Co-design and maintain canonical metrics and models for Engagement reporting as dashboards move from the current infrastructure onto DWH-backed stacks (see the sketch after this list).
- Support semantic layers and BI tools (Looker, Power BI, etc.) so that multiple personas (analysts, marketers, data scientists) can reliably self-serve analytics.
- Make sure dashboards and BI assets are tightly integrated with Loomi, so the agent can understand and explain what users see, not just raw tables.
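As one possible shape for those canonical metrics, here is an illustrative Python sketch of a tiny metric registry that compiles a single definition into BigQuery SQL. In practice this logic would live in a semantic layer such as Cube; every name below is hypothetical:

```python
# Illustrative only: one canonical metric definition serving both dashboards
# and the Loomi agent, so neither re-derives business logic on its own.
from dataclasses import dataclass


@dataclass(frozen=True)
class Metric:
    name: str
    sql: str          # aggregation expression
    table: str        # governed gold-layer table
    description: str  # surfaced to BI users and to the agent alike


METRICS = {
    "campaign_ctr": Metric(
        name="campaign_ctr",
        sql="SAFE_DIVIDE(SUM(clicks), SUM(impressions))",
        table="example_project.gold.campaign_daily",  # hypothetical
        description="Click-through rate across campaign sends.",
    ),
}


def compile_metric(metric_name: str, group_by: str) -> str:
    """Render one canonical definition into SQL."""
    m = METRICS[metric_name]
    return (
        f"SELECT {group_by}, {m.sql} AS {m.name}\n"
        f"FROM `{m.table}`\n"
        f"GROUP BY {group_by}"
    )


print(compile_metric("campaign_ctr", "campaign_id"))
```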
What technologies and tools does the Datacraft team work with?
- Programming languages — Python (primary), Go, SQL
- Messaging & streaming — Apache Kafka
- Databases & storage — BigQuery, Apache Iceberg, Google Cloud Storage (GCS), Mongo, Redis
- Data processing — Apache Spark, Dataproc, Airflow / Cloud Composer
- DWH platforms — BigQuery (primary), Snowflake, Databricks (customer-facing)
- Infrastructure — Google Cloud Platform (GCP), Kubernetes, Terraform
- AI / Agentic — LLM APIs, agent orchestration frameworks, MCP, evaluation harnesses
- BI & semantic layers — Cube, Looker Studio
- Observability & operations — Grafana, Prometheus, PagerDuty, Sentry, OpenTelemetry
- Software & tools — GitLab (CI/CD), Jira, Confluence
- AI coding agents — Cursor, Claude Code
The team's owned domains include DWH data exports and ingestion, data modeling for analytics, the Loomi Analytics Agent data layer, Engagement dashboards on DWH, and semantic layers. Experience with data lakehouse architectures, open table formats (Iceberg), and agentic/LLM systems is highly valued.
Your success story will be:
In 30 Days:
- Gain understanding of company processes, team dynamics, the product, and the Datacraft domain — DWH strategy, Loomi Analytics, dashboards, and key data services.
- Set up your local and GCP development environment and complete the Engagement engineering onboarding.
- Understand the current state of EBQ/data export pipelines, ongoing DWH architecture research, and the Loomi Analytics Agent roadmap.
In 90 Days:
- Deliver your first meaningful contribution to the DWH data platform — a pipeline, data model, or orchestration improvement that ships to production.
- Become comfortable with the end-to-end data flow from Engagement (Kafka, IMF) through to DWH destinations and BI layers.
- Participate in architecture discussions and contribute to key design decisions (KDDs) for DWH architecture, Loomi data access, or dashboard data models.
- Take part in the team's on-call/on-duty rotation.
In 180 Days:
- Own at least one component or domain within the Datacraft scope — able to independently design, build, and maintain it.
- Be a trusted contributor in your domain — understanding the data platform deeply enough to make informed trade-off decisions and challenge technical proposals.
- Contribute to measurable progress on key team goals: first DWH export customers live, first dashboards on DWH, or first Loomi Agent improvements backed by new data interfaces.
You have the following experience and qualities:
Professional experience
- Solid data engineering background with strong SQL and data modeling skills (star/snowflake schemas, slowly changing dimensions, partitioning/clustering, etc.).
- Hands-on experience building production-grade data pipelines on GCP, ideally involving BigQuery, Apache Iceberg, Apache Spark on Dataproc, and Airflow (Cloud Composer).
- Experience with orchestration and workflow tools — specifically Airflow / Cloud Composer — and comfort working with DAG-based systems for scheduled and event-driven jobs.
- Familiarity with open table formats (Iceberg preferred, Delta Lake / Hudi acceptable) and how they interact with query engines and DWH platforms.
- Strong programming skills in Python (preferred); Scala/Java/Go also relevant.
- Fluent use of AI coding agents (Cursor, Claude Code, Copilot, Gemini CLI, or similar) — you should already be using these tools daily to accelerate development, prototyping, debugging, and code review.
- Good understanding of data quality, lineage, and observability (monitoring SLAs/SLOs, detecting missing/late loads, backfilling strategies); a minimal freshness-check sketch follows this list.
- Ability to work across product and engineering teams, turning fuzzy problem statements into incremental, shippable slices.
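As a rough illustration of that observability point, a freshness check behind an SLO like "events land within two hours" could start as small as this sketch; the table, column, and threshold are hypothetical:

```python
# Illustrative only: flag a table whose newest row is older than the allowed lag.
import datetime as dt

from google.cloud import bigquery


def is_fresh(table: str, max_lag: dt.timedelta) -> bool:
    """Return True if the newest row in `table` is within the allowed lag."""
    client = bigquery.Client()
    rows = client.query(
        f"SELECT MAX(ingested_at) AS latest FROM `{table}`"  # hypothetical column
    ).result()
    latest = next(iter(rows)).latest
    if latest is None:
        return False  # empty table: certainly not fresh
    return dt.datetime.now(dt.timezone.utc) - latest <= max_lag


if not is_fresh("example_project.silver.events", dt.timedelta(hours=2)):
    # In production this would page via PagerDuty or fire a Grafana alert.
    raise RuntimeError("events table is stale: investigate loads or backfill")
```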
Strongly preferred
- Prior experience developing agentic platforms or AI-powered analytics systems — e.g., building agent orchestration, tool-use frameworks, skill registries, or evaluation harnesses for LLM-based products.
- Experience building data access layers that LLMs query (SQL generation, semantic layers as tools, retrieval-augmented generation over structured data).
- Background in marketing/product analytics or customer data platforms: familiarity with events, sessions, funnels, experiments, and lifecycle KPIs.
- Hands-on experience with BI tools and semantic layers (Looker, dbt, etc.), especially where those sit on top of DWHs.
- Previous work on platform-level data exports/imports or DWH integrations, including attention to consent, privacy, and retention.
- Experience with Snowflake or Databricks in addition to BigQuery — our customers use all three.
Personal qualities
- Ownership & accountability — you own problems from first KDD/architecture option through to rollout, instrumentation, and follow-up improvements.
- Product thinking — you naturally ask "who uses this, for what decision, with what constraints?" before designing models or pipelines. Challenging our product leaders is welcomed.
- Collaboration & communication — able to explain trade-offs and constraints clearly to non-engineers, and to document your designs and decisions in a way others can build on.
- Bias for reliability — you care about operational excellence (SLOs, oncall friendliness, proactive alerting) as much as features.
- Continuous improvement mindset — you are comfortable iterating in small, validated steps, and revisiting assumptions when customer usage or data shows a different reality.
- Comfortable operating in a remote-first environment with distributed teams across Central Europe.
More things you'll like about Bloomreach:
Culture:
- A great deal of freedom and trust. At Bloomreach we don’t clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one.
- We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review and internal communication.
- We believe in flexible working hours to accommodate your working style.
- We work virtual-first with several Bloomreach Hubs available across three continents.
- We organize company events to experience the global spirit of the company and get excited about what's ahead.
- We encourage and support our employees to engage in volunteering activities - every Bloomreacher can take 5 paid days off to volunteer.*
- The Bloomreach Glassdoor page elaborates on our stellar 4.4/5 rating. The Bloomreach Comparably page Culture score is even higher at 4.9/5.
Personal Development:
- We have a People Development Program -- participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing & updating competency maps for select functions.
- Our resident communication coach Ivo Večeřa is available to help navigate work-related communications & decision-making challenges.*
- Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach and follow-up check-ins.
- Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.)*
Well-being:
- The Employee Assistance Program -- with counselors -- is available for non-work-related challenges.*
- Subscription to Calm -- a sleep and meditation app.*
- We organize ‘DisConnect’ days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones.
- We facilitate sports, yoga, and meditation opportunities for each other.
- Extended parental leave up to 26 calendar weeks for Primary Caregivers.*
Compensation:
- Restricted Stock Units or Stock Options are granted depending on a team member’s role, seniority, and location.*
- Everyone gets to participate in the company's success through the company performance bonus.*
- We offer an employee referral bonus of up to $3,000 paid out immediately after the new hire starts.
- We reward & celebrate work anniversaries -- Bloomversaries!*
(*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)
Excited? Join us and transform the future of commerce experiences!
If this position doesn't suit you, but you know someone who might be a great fit, share it - we will be very grateful!
Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.