Senior Data Engineer

About DKatalis

DKatalis is a financial technology company with multiple offices in the APAC region. In our quest to build a better financial world, one of our key goals is to create an ecosystem-linked financial services business.

DKatalis is built and backed by experienced and successful entrepreneurs, bankers, and investors in Singapore and Indonesia who bring more than 30 years of experience building financial services and banking businesses at Bank BTPN, Danamon, Citibank, McKinsey & Co, Northstar, Farallon Capital, and HSBC, and who come from top-tier schools such as Stanford, Cambridge, London Business School, and JNU.


About the Role

We are seeking a hands-on Senior Data Engineer to help us build out and manage our data infrastructure, which must operate reliably at scale with a high degree of automation in setup and maintenance. The role involves setting up and managing the data infrastructure, building new systems where required, and building and optimizing key ETL pipelines over both batch and streaming data. The ability to work with the product, engineering, BI/analytics, and data science teams is essential. The role owns data model design and data quality, and plays an active part in ensuring data governance tooling is implemented and its policies adhered to.

The individual will also need to work with technical leadership to make well-informed architectural choices when required. A high degree of empathy is required for the needs of the downstream consumers of the data artefacts produced by the data engineering team (software engineers, data scientists, business intelligence analysts, etc.), and the individual needs to produce transparent and easily navigable data pipelines. Value should be placed on consistently producing high-quality metadata to support discoverability and consistency of calculation and interpretation.

Candidates should have broad experience across the following platforms, systems, languages, and capabilities:

  • Ideally GCP, but strong experience in another platform such as AWS or Azure will suffice
  • Event streaming platforms such as Kafka
  • Stream analytics frameworks such as Spark, GCP Dataflow, etc
  • Workflow scheduler such as Apache Airflow
  • Cloud data warehouses such as BigQuery, Redshift or Snowflake
  • Fluency with Kubernetes and Python
  • Comfortable writing detailed design documents
