
Azure DevOps

India - Bengaluru

Job Title: Azure Cloud Data Engineer

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry — projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence, and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing the energy and financial services sectors.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

Role Description:

Location: Bangalore

Experience: 7 to 10 years

Utilizes software engineering principles to deploy and maintain fully automated data transformation pipelines that combine a wide variety of storage and computation technologies to handle a range of data types and volumes in support of data architecture design. A Senior Data Engineer designs and oversees the entire data infrastructure, including data products and data pipelines that are resilient to change, modular, flexible, scalable, reusable, and cost effective.

  • Bachelor’s degree in computer science, engineering, or a related field
  • Technical leadership experience
  • Experience consulting on and managing business needs
  • Strong experience in Python is preferred, but experience in other languages such as Scala, Java, or C# is accepted
  • Experience building Spark applications using PySpark
  • Experience with file formats such as Parquet, Delta, and Avro
  • Experience efficiently querying API endpoints as a data source
  • Understanding of Git workflows in software development
  • Experience using Azure DevOps pipelines and repositories to deploy and maintain solutions
  • Understanding of Ansible and how to use it in Azure DevOps pipelines

 

