
Lead Data Engineer
What is Cobre, and what do we do?
Cobre is Latin America's leading instant B2B payments platform. We solve the region's most complex money movement challenges by building advanced financial infrastructure that enables companies to move money faster, more safely, and more efficiently.
We enable instant business payments—local or international, direct or via API—all from a single platform.
Built for fintechs, PSPs, banks, and finance teams that demand speed, control, and efficiency. From real-time payments to automated treasury, we turn complex financial processes into simple experiences.
Cobre is the first platform in Colombia to enable companies to pay both banked and unbanked beneficiaries within the same payment cycle and through a single interface.
We are building the enterprise payments infrastructure of Latin America!
What we are looking for:
We're building a data platform that does three things: powers our internal operations, supports our AI and risk capabilities, and ships data products directly to clients. Right now, we have the foundation. We need someone to turn it into a competitive advantage.
You'll report to the Chief Data Officer and work with a team of 15+: data engineers, ML scientists, risk analytics engineers, and PMs. We run an AI center of excellence, and this platform is its backbone.
What you would be doing:
- The current stack — Snowflake, AWS, Terraform, Airflow, dbt, Kafka, ELK, Sigma. It works, but it wasn't built for what's coming: real-time client-facing products at scale. Your first job is to assess it honestly and build a plan.
- Data infrastructure for decisioning — We have a dedicated squad building our decisioning platform. Your job is to give them the foundation: real-time data pipelines, feature serving, low-latency access to the signals they need. You're not building the decisioning logic—you're making sure the data is there when it matters.
- Infrastructure supporting client-facing data products — Real-time APIs, embedded analytics, decisioning capabilities that our clients integrate into their own workflows. Not dashboards—products.
- AI-ML platform layer — Our data scientists need to deploy and monitor models in production without hand-holding. You'll build the infrastructure that makes that possible.
- The team — You'll hire and grow the engineering side as we scale.
What you need:
Technical:
- 5+ years in data engineering; 2+ leading a team or major workstream
- You've built production data pipelines (Airflow, Kafka, or similar) and know the difference between "works in dev" and "works at scale"
- Real-time systems experience—streaming data, low-latency APIs, event-driven architectures.
- Experience establishing and owning CI/CD practices that ensure quality, reliability, and fast delivery of data solutions.
- Experience owning data quality and proactive monitoring: you identify and surface issues, and communicate them clearly, before other teams find them.
- Strong SQL and Python. We'll ask you to prove it.
- Strong knowledge of open table formats and experience with modern data architecture.
- Experience with Snowflake or equivalent cloud warehouse.
- Experience with AWS.
- You've shipped data products to external users, not just internal stakeholders.
Leadership and Business Alignment:
- Lead and develop the team, fostering high performance and professional growth.
- Provide ongoing mentorship, feedback, and establish best practices.
- Collaborate closely with business teams to understand priorities and requirements.
- Align technical strategy and execution with business roadmaps across the organization.
- Act as a bridge between business and technology, ensuring focus on impact and value delivery.
- Build processes and documentation that enable teams to self-serve on the data platform, reducing hand-holding and dependency.
- Manage stakeholders across engineering, risk, finance, and client success teams.
Important but learnable:
- MLOps (model deployment, monitoring, feature stores)
- Fintech/payments domain
- English (team is bilingual)
- Understanding of payment rails, reconciliation, and settlement processes
- Knowledge of treasury operations and liquidity management data
- Experience with financial data regulations (PCI-DSS, data residency requirements in LatAm)
