
Data Solutions Engineer

Razorpay was founded by Shashank Kumar and Harshil Mathur in 2014. Razorpay is building a new-age digital banking hub (Neobank) for businesses in India, with the mission of enabling frictionless banking and payments experiences for businesses of all shapes and sizes. What started as a B2B payments company now processes billions of dollars in payments for lakhs of businesses across India.

We are a full-stack financial services organisation, committed to helping Indian businesses with comprehensive and innovative payment and business banking solutions built on robust technology to address the entire length and breadth of the payment and banking journey for any business. Over the past year, we've disbursed millions of dollars in loans to thousands of businesses. In parallel, Razorpay is reimagining how businesses manage money by simplifying business banking (via Razorpay X) and enabling capital availability for businesses (via Razorpay Capital).

The Role:

We at Razorpay are looking for a Data Solutions Engineer to join our growing team. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as the flow of data through it. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up, and who will be excited by the prospect of optimizing or even re-designing our company’s data architecture to support our next generation of products and data initiatives.

Your responsibilities include:

  • Expanding and optimizing Razorpay's data and data pipeline architecture, as well as the data flow across systems.
  • Designing, implementing, and operating stable and scalable solutions for moving data from production systems to analytical data platforms and user-facing applications.
  • Collaborating with business customers to understand requirements and implement analytical solutions in a fast-paced environment.
  • Designing, creating, managing, and utilizing large datasets for business purposes.
  • Contributing to high-level design with guidance, including functional modeling and module breakdown.
  • Building and executing data modeling projects across various tech stacks.
  • Challenging existing approaches and proposing innovative ways to process, model, and consume data, considering tech stack choices and design principles.
  • Building and integrating robust data processing pipelines for enterprise-level business analytics.
  • Incorporating automated monitoring, alerting, and self-healing features (restartability and graceful failures) into consumption pipelines (see the Python sketch after this list).
  • Translating business requirements into technical specifications, including facts, dimensions, filters, derivations, and aggregations.
  • Supporting analytics teams in debugging data accuracy and related issues.
  • Communicating effectively with engineering, product, and analytics teams to define key business questions and build datasets to answer those questions.
  • Demonstrating a passion for working with large datasets, integrating them to address business questions, and driving change.
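
As a flavour of the "restartability and graceful failures" item above, here is a minimal Python sketch of a retried, checkpointed pipeline step. It is an illustration under assumptions, not an actual Razorpay component: the checkpoint path, notify_oncall hook, and watermark format are all hypothetical.

# Minimal sketch of a restartable pipeline step with retries, alerting,
# and a checkpoint so reruns resume where the last run committed.
# All names here are illustrative, not actual Razorpay internals.
import json
import logging
import time
from pathlib import Path

CHECKPOINT = Path("/tmp/orders_pipeline.checkpoint")  # hypothetical location
MAX_RETRIES = 3

def notify_oncall(step: str) -> None:
    # Stand-in for a real alerting hook (PagerDuty, Slack, etc.).
    logging.error("alerting on-call: step %r exhausted retries", step)

def run_step(step: str, fn, *args):
    """Run one step with exponential-backoff retries; alert on exhaustion."""
    for attempt in range(1, MAX_RETRIES + 1):
        try:
            return fn(*args)
        except Exception:
            logging.exception("step %r failed (attempt %d)", step, attempt)
            if attempt == MAX_RETRIES:
                notify_oncall(step)
                raise  # fail gracefully: surface the error, keep state intact
            time.sleep(2 ** attempt)  # back off before retrying

def last_watermark() -> str:
    """Read the committed watermark so a rerun is idempotent."""
    if CHECKPOINT.exists():
        return json.loads(CHECKPOINT.read_text())["watermark"]
    return "1970-01-01T00:00:00"

def commit_watermark(ts: str) -> None:
    CHECKPOINT.write_text(json.dumps({"watermark": ts}))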

Essential technical skillsets include:

  • Experience with Enterprise Business Intelligence/Data platforms: Sizing, tuning, optimization, and system landscape integration in large-scale enterprise deployments.
  • Programming proficiency: Preferably in Python.
  • Data manipulation and modeling: Experience with Advanced SQL and DBT or other data modeling tools/frameworks.
  • Software development methodologies: Good knowledge of Agile, the SDLC, and CI/CD practices, along with the relevant tools.
  • Performance optimization: In-depth knowledge of performance tuning/optimizing data processing jobs and debugging time-consuming jobs.
  • Data modeling expertise: Proven experience in developing conceptual, logical, and physical data models for EDW (enterprise data warehouse) and OLAP database solutions.
  • Distributed systems understanding: A solid grasp of distributed systems principles.
  • Large-scale data experience: Experience working extensively in a multi-petabyte DW environment.
  • System engineering in product environments: Experience in engineering large-scale systems within a product development context.
  • Workflow Orchestration: Experience with tools like Apache Airflow for managing and scheduling complex data pipelines (a minimal DAG sketch follows this list).
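
To make the orchestration item concrete, below is a minimal Airflow DAG sketch, assuming Airflow 2.x. The DAG id, task names, and callables are hypothetical; the retry settings simply echo the restartability theme above.

# A minimal Airflow 2.x DAG sketch; names and schedules are illustrative.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    ...  # e.g. pull one day's partition from a production replica

def load(**_):
    ...  # e.g. upsert into the warehouse so reruns stay idempotent

with DAG(
    dag_id="orders_daily",                    # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={
        "retries": 3,                         # restart on transient failures
        "retry_delay": timedelta(minutes=5),
    },
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load                       # simple linear dependency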

Good to have:

  • Streaming Technologies: A good understanding of streaming technologies like Kafka and Spark Streaming would be beneficial (see the streaming sketch after this list).
  • Big Data Ecosystem Experience: Experience with Hadoop, MapReduce, Hive, and Spark, including Scala programming, would be advantageous.
  • JVM languages: Proficiency in Java and Scala would be advantageous.
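
For the streaming item above, a minimal PySpark Structured Streaming job reading events from Kafka could look like the sketch below. The broker address, topic name, event schema, and sink paths are all assumed for illustration.

# A minimal PySpark Structured Streaming sketch; all config is illustrative.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

# Hypothetical event schema for illustration only.
schema = StructType([
    StructField("payment_id", StringType()),
    StructField("amount", StringType()),
    StructField("status", StringType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # assumed broker
    .option("subscribe", "payments")                   # assumed topic
    .load()
    # Kafka delivers bytes; decode the value and parse the JSON payload.
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/payments")              # illustrative sink
    .option("checkpointLocation", "/data/ckpt")    # offsets survive restarts
    .start()
)
query.awaitTermination()
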
Razorpay believes in and follows an equal employment opportunity policy that does not discriminate on the basis of gender, religion, sexual orientation, colour, nationality, age, etc. We welcome interest and applications from all groups and communities across the globe.
 