
Data Platform Engineer

Remote

Job Overview

Lola Blankets is a fast-growing direct-to-consumer brand behind some of the internet’s most-loved blankets. Our open-source-first data stack (dbt, Dagster, and Lightdash, with a warehouse migration underway from Snowflake to MotherDuck) powers key decisions, so we keep it reliable, fast, and actionable.

We’re hiring a Data Platform Engineer to sit at the intersection of data and engineering—owning the analytics platform foundation while supporting the broader engineering roadmap across product, operations, and integrations. 

On the data side, you’ll partner with our Analytics Lead to own ingestion, transformation, orchestration, and the semantic layer. When a dashboard number looks off, you’ll trace it through Lightdash/dbt/pipelines, find the root cause, and fix it. 

On the engineering side, you’ll work with our Technology & Engineering Lead on integrations, event pipelines, and platform infrastructure, applying a DevOps mindset to environments, deployments, and production reliability. 

We’re a lean, builder team: open-source-leaning, fast-moving, and opinionated. You’ll be expected to bring strong judgment and the execution to match.

Core Responsibilities

Data Platform & Pipeline Ownership 

  • Own our data ingestion layer end-to-end, including completing our migration to open-source ingestion tooling (dlt) and maintaining reliability as the stack evolves 
  • Manage dbt models, tests, documentation, and the semantic layer - the definitions that determine what every metric means across the business 
  • Own Dagster orchestration: scheduling, retries, alerting, and failure handling across all pipeline runs 
  • Keep Lightdash metadata, dimension/measure definitions, and access controls accurate and current 
  • Accelerate data refresh cycles to support near-real-time operational use across the business 

Data Observability & Quality 

  • Build monitoring, failure alerting, and anomaly detection into the stack so issues surface proactively 
  • Chase data through systems when things go wrong: trace why records drop or transform unexpectedly between source and dashboard, and resolve the root cause rather than the symptom 
  • Establish and document data quality standards and lineage practices across the warehouse 

Engineering Support & Integrations 

  • Partner with our Technology and Engineering Lead on platform infrastructure, system integrations, and technical initiatives where data is a core component 
  • Build and maintain reverse ETL pipelines to push warehouse data back into operational tools 
  • Support real-time event pipeline development as new data sources and product surfaces come online 
  • Contribute to A/B testing infrastructure and the systems that support consistent metric definitions across the org 

DevOps & Platform Governance 

  • Own separation of dev and production environments: deployment pipelines, change management, access controls, and release practices 
  • Run a PII audit across the stack and implement data warehouse governance standards 
  • Maintain infrastructure documentation and ensure the platform is operable beyond any single person 
  • Continuously evaluate our platform stack to ensure we're using the right tools - favoring open-source, cost-effective, and maintainable solutions 

Qualifications

  • 3+ years of data engineering or data platform experience - you've owned production pipelines, not just built them in a sandbox 
  • Strong dbt skills: models, tests, sources, exposures, and the semantic layer 
  • Solid Snowflake or equivalent cloud warehouse experience (we expect to land on MotherDuck shortly)
  • Hands-on with a modern orchestration tool (Dagster, Airflow, Prefect, or similar) 
  • Strong Python or TypeScript plus SQL - enough to read, debug, and write anything in the stack
  • DevOps experience: you think in terms of environments, deployments, change control, and what happens when things break in production 
  • Open-source bias - you'd rather build and own something than pay for a managed tool that abstracts away control 
  • Comfortable with GenAI-assisted development: using LLMs as part of your development workflow to move faster and write better code 
  • Comfortable debugging data end-to-end - you can trace a wrong number back through the semantic layer, dbt models, and ingestion pipeline to the source 
  • Works across team boundaries comfortably; this role sits between data and engineering and requires interfacing with leaders from both teams 
  • Works well independently in a lean team with minimal process overhead 
  • Experience in DTC, ecommerce, or a fast-moving consumer business a plus 

Compensation and Perks

  • 21 days paid vacation + all federal holidays 
  • Full health benefits 
  • 16 weeks of paid birth-parent leave; 8 weeks of paid non-birth-parent leave
  • 55% off Lola Blankets for friends and family 
  • Opportunities for career growth and leadership roles within Lola Blankets 

Apply for this job
