
Staff Data Engineer
Interwell Health is a kidney care management company that partners with physicians on its mission to reimagine healthcare—with the expertise, scale, compassion, and vision to set the standard for the industry and help patients live their best lives. We are on a mission to help people and we know the work we do changes their lives. If there is a better way, we will create it. So, if our mission speaks to you, join us!
Reporting to the Director of Data Engineering, the Staff Data Engineer serves as a senior technical leader responsible for shaping, scaling, and governing our modern data ecosystem. This role blends architecture, hands-on engineering, platform leadership, and cross-functional partnerships to deliver high-quality data products that power clinical, operational, financial, and analytical outcomes. Deep experience with Databricks, Python, dbt, and Microsoft Fabric, along with strong fluency in healthcare data and compliance standards, is essential. At its core, you’ll work closely with teams across the organization to deliver governed, high‑quality, analytics‑ready data at scale.
Our Tech Stack: Databricks, Delta Lake, Unity Catalog, Microsoft Fabric (OneLake, Lakehouse, Data Factory), Azure, dbt, Python, PySpark, Spark SQL.
What You’ll Do:
Architecture & Strategy
- Design and evolve a scalable, secure, cloud‑native lakehouse platform leveraging Databricks, Microsoft Fabric (OneLake, Lakehouse, Data Factory), and dbt.
- Define modeling patterns, governance frameworks, and engineering best practices across the data lifecycle.
- Lead design reviews and guide teams in adopting scalable architectural patterns.
- Drive long‑term platform strategy and evaluate emerging technologies.
Hands-on Engineering
- Design and implement batch and streaming data pipelines for healthcare data sources (EHR, claims, HL7/FHIR, APIs, flat files, databases).
- Develop modular ingestion, quality, lineage, metadata, and observability frameworks that scale across domains.
- Produce clean, analytics‑ready datasets and data models for BI, analytics, and machine learning workloads.
- Implement HIPAA‑aligned access patterns and secure handling of PHI.
- Architect Databricks workloads (clusters, jobs, Unity Catalog, Delta Lake) for reliability, performance, and cost efficiency.
- Integrate Databricks and Microsoft Fabric with Azure services and enterprise systems.
Leadership & Collaboration
- Partner with product managers, data scientists, analysts, clinicians, and business stakeholders to translate healthcare data needs into scalable solutions.
- Lead cross-functional initiatives that modernize and unify the organization’s data ecosystem.
- Mentor senior and mid-level engineers; elevate team capability through technical coaching and standards.
- Drive roadmap planning, platform evolution, and long-term data strategy.
- Champion engineering excellence, reliability practices, documentation quality, and governance.
What You’ll Need
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in data engineering.
- 2+ years operating in a senior- or staff-level engineering role.
- Deep hands-on proficiency with Databricks, Spark, Delta Lake, dbt, and Python.
- Proven ability to design and operate large-scale cloud data platforms (Azure preferred).
- Hands-on experience with Microsoft Fabric components: Data Engineering, Data Factory, Lakehouse, and OneLake.
- Advanced data platform architecture and Lakehouse design expertise.
- Demonstrated ability to design modular, extensible frameworks and guide the long-term evolution of enterprise data platforms.
- Strong command of distributed data processing and cloud-native engineering.
- Experience working in HIPAA-regulated environments and handling PHI.
- Healthcare data fluency, including regulated data handling and compliance.
- Technical leadership, mentorship, and influence across teams.
- Strong communication skills with both technical and clinical stakeholders.
- Experience with platform reliability, CI/CD for data pipelines, and infrastructure as code.
- 100% remote (ET or CT work hours preferred)
Preferred
- Experienced in implementing and supporting Epic integrations, leveraging Cogito Cloud and Caboodle data models, and delivering reliable incremental data pipelines from Caboodle/Clarity.
Our mission is to reinvent healthcare to help patients live their best lives, and we proudly live our mission-driven values:
- We care deeply about the people we serve.
- We are better when we work together.
- Humility is a source of our strength.
- We bring joy to our work.
- We deliver on our promises.
We are committed to diversity, equity, and inclusion throughout our recruiting practices. Everyone is welcome and included. We value our differences and learn from each other. Our team members come in all shapes, colors, and sizes. No matter how you identify your lifestyle, creed, or fandom, we value everyone's unique journey.
Oh, and one more thing … a recent study shows that men apply for a job or promotion when they meet only 60% of the qualifications, but women and other marginalized groups apply only if they meet 100% of them. So, if you think you’d be a great fit, but don’t necessarily meet every single requirement on one of our job openings, please still apply. We’d love to consider your application!
Come join us and help our patients live their best lives. Learn more at www.interwellhealth.com.
It has come to our attention that some individuals or organizations are reaching out to job seekers, posing as potential employers, and presenting enticing employment offers. We want to emphasize that these offers are not associated with our company and may be fraudulent in nature. Please note that our organization will not extend a job offer without prior communication with our recruiting team and hiring managers and a formal interview process.
