
Senior Data & Analytics Engineer

Indonesia

PropHero is an AI-driven marketplace transforming property investment. Backed by global VCs and founded by McKinsey alumni, we’re expanding our team. Join our thriving, flexible culture, surrounded by ambitious people driving change. At PropHero, we’re making property investment as simple as buying shares or ETFs. Be part of the future we’re building!

We are value-driven!

🤝 BELIEVE: We have a contagious passion and entrepreneurial spirit.

🔍 CONNECT: We care for each other and create a “one-team” spirit.

📈 RAISING THE BAR: We push for exceptional performance and never settle for mediocrity.

🔥 OWN IT: We are owners no matter the circumstances.

🌐 DELIVER: We deliver meaningful, measurable outcomes driving a positive impact.

Do these values resonate with you? Keep reading!

How you will shape PropHero:

As a Senior Data & Analytics Engineer at PropHero, you will own the complete data lifecycle, from real-time ingestion to analytics-ready insights. You'll architect event-driven pipelines to stream data from external sources (HubSpot, APIs, webhooks) into PostgreSQL, then transform raw data into dimensional models and actionable dashboards. Working in an AWS ecosystem, you'll build the data infrastructure (Lambda, EventBridge, RDS) while also designing snowflake-modeled data marts and Metabase visualizations. This is a true end-to-end role with equal emphasis on data engineering (50%) and analytics engineering (50%): you'll bridge technical infrastructure and business intelligence, ensuring our teams have both reliable data pipelines and clean, business-ready datasets for property valuation models and market analysis.

  • Event-Based Data Streaming: Design and implement event-driven pipelines using AWS services (Lambda, EventBridge, Kinesis/MSK, SQS) to ingest data from external sources in real-time.
  • HubSpot Integration: Build and maintain streaming data pipelines between HubSpot CRM and PostgreSQL, handling webhook events, API polling, and CDC patterns for sub-minute data freshness.
  • External API Integration: Develop robust connectors for third-party APIs, webhooks, and data sources, ensuring reliable data capture with proper error handling and retry logic.
  • AWS Infrastructure Management: Deploy and manage AWS resources (Lambda, RDS, EventBridge, CloudWatch, S3) for scalable data solutions.
  • Monitoring & Alerting: Build comprehensive monitoring dashboards and alerting systems to track pipeline health, data freshness, and error rates.
  • Data Modeling (Snowflake Method): Design and implement dimensional data models in PostgreSQL using snowflake methodology, creating efficient fact tables, dimension tables, and slowly changing dimensions (SCDs).
  • Data Transformation Pipelines: Build SQL-based transformation workflows to convert operational database tables into analytics-ready data marts, ensuring data consistency and business logic integrity.
  • Data Marts Development: Create purpose-built data marts for different business domains (property valuation, customer analytics, market trends) optimized for analytical queries.
  • BI Development: Design, build, and maintain dashboards, reports, and visualizations in Metabase for self-service analytics across the organization.
  • Analytics & Reporting: Perform data analysis to answer business questions, identify trends, and deliver actionable insights to product and leadership teams.
  • Metrics Definition: Partner with business stakeholders to define KPIs, metrics, and business logic; document metric definitions and calculation methods.
  • Data Quality & Validation: Implement schema validation, data type checking, and automated quality gates at both the ingestion layer and transformation layer to ensure data accuracy and consistency.
  • SQL & Database Optimization: Write efficient, performant SQL queries; optimize query performance and database design through proper indexing, query structure, materialized views, and connection pooling.
  • Documentation & Collaboration: Maintain clear documentation of pipeline architecture, data flows, API integrations, data models, transformation logic, and metric definitions; work closely with distributed teams across different time zones.
  • End-to-End Ownership: Take full ownership of data systems from ingestion to insights, ensuring seamless integration between infrastructure and analytics layers.
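For an illustrative feel of the ingestion work described above, here is a minimal sketch of an idempotent webhook handler with schema validation. It uses sqlite3 as a stand-in for PostgreSQL, and the payload fields (objectId, propertyName, propertyValue, occurredAt) are assumptions modeled loosely on HubSpot-style webhook events, not PropHero's actual schema:

```python
import json
import sqlite3

# Stand-in for the PostgreSQL target; table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE contacts (
        contact_id INTEGER PRIMARY KEY,
        email TEXT,
        updated_at INTEGER
    )
""")

REQUIRED_FIELDS = {"objectId", "propertyName", "propertyValue", "occurredAt"}

def validate(event: dict) -> bool:
    """Quality gate at the ingestion layer: reject malformed events."""
    return REQUIRED_FIELDS.issubset(event)

def upsert_contact(event: dict) -> None:
    """Idempotent upsert keyed on contact id; newer events win,
    so out-of-order webhook delivery cannot overwrite fresh data."""
    conn.execute(
        """
        INSERT INTO contacts (contact_id, email, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(contact_id) DO UPDATE SET
            email = excluded.email,
            updated_at = excluded.updated_at
        WHERE excluded.updated_at > contacts.updated_at
        """,
        (event["objectId"], event["propertyValue"], event["occurredAt"]),
    )

def handle_webhook(body: str) -> int:
    """Lambda-style entry point: process a batch, return accepted count."""
    accepted = 0
    for event in json.loads(body):
        if validate(event) and event["propertyName"] == "email":
            upsert_contact(event)
            accepted += 1
    return accepted

batch = json.dumps([
    {"objectId": 1, "propertyName": "email",
     "propertyValue": "old@example.com", "occurredAt": 100},
    {"objectId": 1, "propertyName": "email",
     "propertyValue": "new@example.com", "occurredAt": 200},
    {"objectId": 2, "propertyName": "phone"},  # missing fields: dropped
])
print(handle_webhook(batch))  # 2 accepted events
print(conn.execute(
    "SELECT email FROM contacts WHERE contact_id = 1").fetchone()[0])
```

In production this logic would run behind API Gateway or EventBridge with retries and dead-letter queues; the sketch only shows the validation and idempotent-write pattern.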

What you will bring to our team:

  • 6+ years of experience in data engineering and analytics roles with proven end-to-end data pipeline ownership.
  • Strong proficiency in Python for building ETL/ELT pipelines, API integrations, and data validation logic.
  • Expert-level SQL proficiency: Advanced SQL skills including window functions, CTEs, complex joins, and query optimization for PostgreSQL.
  • Hands-on AWS experience with Lambda, EventBridge, Kinesis/SQS, RDS/PostgreSQL, CloudWatch, and S3.
  • Event-driven architecture: Proven experience with event buses, message queues, webhooks, and streaming architectures.
  • Real-time streaming data: Experience with Kafka, Kinesis, or similar streaming platforms; understanding of CDC and real-time data patterns.
  • Snowflake-schema data modeling: Deep understanding of dimensional modeling principles, star/snowflake schemas, fact/dimension tables, and normalization techniques.
  • PostgreSQL expertise: Strong knowledge of PostgreSQL-specific features, indexing strategies, materialized views, performance tuning, and connection pooling.
  • API integration expertise: Strong experience with REST APIs, authentication methods (OAuth, API keys), rate limiting, and error handling.
  • BI tools experience: Hands-on experience with Metabase, Tableau, Looker, or similar BI platforms for dashboard development.
  • Professional English proficiency: Excellent written and verbal English communication skills for technical documentation, code comments, analysis reports, presenting insights, and daily collaboration with Australia and Spain-based team members.
  • Problem-solving: Analytical mindset with ability to debug complex pipeline issues, optimize query performance, and implement robust error recovery.
  • Remote collaboration: Comfortable working remotely with distributed teams across different time zones, able to communicate complex technical concepts clearly in English.
  • dbt experience (preferred): Familiarity with dbt or similar transformation frameworks for building modular, tested SQL pipelines.
  • HubSpot or CRM API experience (bonus): Familiarity with CRM APIs (HubSpot, Salesforce, Zoho, or similar) is a strong plus.
  • PropTech/Real Estate (bonus): Experience with property data, valuation models, real estate or construction project analytics is a plus.

What can we offer you:

🚀 Growth Mindset

We believe in pushing boundaries and embracing challenges. Whether it’s exploring new tools, improving processes, or taking on uncharted projects, we support a culture of curiosity and innovation where learning never stops.

🌎 A Diverse, Collaborative Team

Join a global team that celebrates diversity and thrives on collaboration. With team members from all over the world, we bring together unique perspectives, ideas, and experiences to create meaningful impact and drive innovation.

📚 Continuous Learning & Development

No matter your background or field of study, you’ll find endless opportunities to learn, grow, and contribute. Our open, transparent environment encourages the exchange of ideas, helping you unlock your full potential.

🌟 Career Growth in an Early-Stage Startup

Be part of a fast-growing startup with global ambitions. As we scale, so will your opportunities to take on more responsibility, shape our success, and advance your career.

💰 Competitive Compensation Package

We value your contributions and offer a competitive economic package to match your impact.

 

Diversity Statement

At PropHero, we are committed to fostering an inclusive and equitable workplace where diverse perspectives and backgrounds are not only welcomed but celebrated. We believe that diversity drives innovation and empowers us to build stronger connections with our clients and communities.

PropHero is an equal opportunity employer and is dedicated to ensuring a hiring process free from discrimination based on race, ethnicity, gender, age, disability, religion, sexual orientation, or any other characteristic protected by law. Our mission is to create a workplace where everyone feels valued, supported, and empowered to achieve their full potential.
