
Data Engineer, Revenue Operations
Madrid, Spain
We are looking for a Mid-Level Data Engineer to join our Revenue Operations team. You will be responsible for building, scaling, and maintaining the data pipelines that support strategic revenue decisions.
This role plays a key part in connecting data across Marketing, Sales, Customer Success, and Finance, ensuring high data quality, reliability, and availability.
The position requires on-site presence in Madrid, with close collaboration across cross-functional teams in a fast-paced and constantly evolving environment.
Responsibilities
1. Data Engineering (Core)
- Design, build, and maintain scalable and reliable data pipelines (ETL/ELT).
- Develop and optimize analytical data models (bronze, silver, and gold layers).
- Ensure data quality, governance, and consistency.
- Monitor pipelines, proactively identify bottlenecks, and resolve failures.
- Work with large volumes of structured and semi-structured data.
2. Revenue Operations
- Integrate data from multiple sources, including:
  - CRM systems (e.g., HubSpot)
  - Marketing platforms
  - Financial and billing systems (SAP)
  - Product data sources
- Build datasets to support analysis of:
  - Sales funnel and pipeline
  - Revenue forecasting
  - Recurring revenue (MRR, ARR)
  - Churn, retention, and expansion
  - Performance metrics for SDRs, AEs, and CSMs
- Support the development of strategic KPIs and metrics for leadership and C-level stakeholders.
- Partner closely with data analysts, RevOps, and business teams.
3. Technology & Tools
- Use Databricks for data processing, transformation, and orchestration.
- Work extensively with advanced SQL and Python.
- Leverage the Google ecosystem, including:
  - BigQuery
  - Google Cloud Storage
  - Google Sheets (automation and integrations)
- Enable BI tools and dashboards (e.g., Looker, Power BI, Tableau).
4. Collaboration & Environment
- Collaborate closely with business teams, translating requirements into technical solutions.
- Participate actively in agile ceremonies (planning, daily stand-ups, reviews).
- Thrive in a dynamic, high-growth, and fast-changing environment.
- Continuously propose improvements in architecture, processes, and performance.
Requirements
- Proven experience as a Mid-Level Data Engineer.
- Strong expertise in SQL (data modeling and performance optimization).
- Solid experience with Python for data engineering.
- Hands-on experience with Databricks.
- Experience with Google Cloud Platform (BigQuery, GCS).
- Previous experience in Revenue Operations, Sales, or Finance.
- Knowledge of SaaS metrics (MRR, ARR, LTV, CAC, churn).
- Strong understanding of:
  - ETL / ELT processes
  - Data Warehousing and Data Lakes
  - Dimensional data modeling
- Experience with version control systems (Git).
Nice to Have
- Experience with BI tools.
- International work experience.
- Fluency in Spanish.
- Advanced English.
