
PC-AWS Data Engineer

India - Pune

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and have been ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

We bring innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.


Job Title: Senior AWS Data Engineering Lead/SME

Location: Pune / Bangalore (preference: Pune)
Job Type: FULL TIME
Experience Level: Senior

 

Job Summary:

We are looking for a highly experienced and motivated Senior AWS Data Engineering Lead to join our growing team. This role blends deep AWS data engineering expertise with strong infrastructure and technical leadership skills. You’ll be responsible for designing and delivering scalable, cloud-native data solutions while working closely with customers and internal teams. Ideal candidates are hands-on builders who excel in clean architecture, automation, and team coordination. 

 

Key Responsibilities:

  • This is a leadership role, leading a growing team of AWS data engineers.
  • Act as lead SME, providing hands-on guidance to AWS data engineering teams.
  • Translate business and functional requirements into AWS well-architected technical solutions.
  • Own end-to-end solution design, architecture, and delivery of data platforms and pipelines.
  • Provide technical leadership to the team and drive adherence to best practices and standards.
  • Design and implement robust, scalable data pipelines using AWS services such as Glue, S3, Lambda, Iceberg, Athena, SQS, and EventBridge (a minimal PySpark sketch follows this list).
  • Develop large-scale data transformations using PySpark and Python, ensuring efficient processing and performance.
  • Build infrastructure using Infrastructure as Code tools like Terraform and AWS CloudFormation.
  • Implement and maintain CI/CD pipelines with tools like Jenkins, GitLab CI/CD, etc.
  • Use Git for version control and manage collaborative code development workflows.
  • Work directly with customers to understand their needs and translate them into technical deliverables.
  • Coordinate across teams to ensure smooth execution, delivery timelines, and system integrations.
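
For illustration, the following is a minimal sketch of the kind of PySpark batch transformation described above. It is not Capco code: the bucket names, paths, and columns (example-raw-bucket, order_amount, and so on) are hypothetical placeholders, and a production AWS Glue job would typically read from the Glue Data Catalog and use job bookmarks rather than hard-coded S3 paths.

    # Minimal sketch of a PySpark batch transformation (hypothetical buckets/columns).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-daily-aggregation").getOrCreate()

    # Read raw order events landed in S3 as Parquet (hypothetical path).
    orders = spark.read.parquet("s3://example-raw-bucket/orders/")

    # Basic cleansing plus a daily aggregation per customer.
    daily_totals = (
        orders
        .filter(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("order_timestamp"))
        .groupBy("customer_id", "order_date")
        .agg(
            F.count("*").alias("order_count"),
            F.sum("order_amount").alias("total_amount"),
        )
    )

    # Write back partitioned by date so Athena can prune partitions efficiently.
    (
        daily_totals.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/daily_order_totals/")
    )

Writing the curated output as partitioned Parquet is one example of the performance and cost trade-offs this role is expected to reason about.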

 

Required Skills and Experience:

  • 7-10+ years of hands-on experience in AWS data engineering and solution delivery.
  • Strong expertise with core AWS services including Glue, S3, Lambda, Iceberg, Athena, SQS, and EventBridge.
  • Advanced proficiency in Python, PySpark, and SQL for data processing and ETL workloads.
  • Proven experience in ETL/ELT pipeline development, performance tuning, and large-scale data handling.
  • In-depth knowledge of event-driven architectures and AWS messaging services (an illustrative SQS consumer sketch follows this list).
  • Solid infrastructure knowledge and hands-on experience with Terraform and/or CloudFormation.
  • Experience implementing CI/CD pipelines using Jenkins, GitLab CI/CD, or similar tools.
  • Familiarity with Git and collaborative version control workflows.
  • Excellent communication skills with a proven ability to gather customer requirements and convert them into scalable technical solutions.
  • Demonstrated experience leading and coordinating engineering teams in delivery-focused environments.
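
To illustrate the event-driven requirement above, here is a small sketch of a Python consumer that long-polls an SQS queue. The queue URL, region, and message shape are hypothetical; a production pipeline would add dead-letter queues, retries, and structured logging.

    # Hedged sketch of an SQS consumer for an event-driven ingestion flow.
    import json
    import boto3

    # Hypothetical queue URL and region; substitute real values in practice.
    QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-ingest-queue"
    sqs = boto3.client("sqs", region_name="ap-south-1")

    def poll_once(max_messages: int = 10) -> None:
        """Long-poll the queue once and process any notifications received."""
        response = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=max_messages,
            WaitTimeSeconds=20,  # long polling reduces empty receives
        )
        for message in response.get("Messages", []):
            body = json.loads(message["Body"])
            # Hypothetical payload: an S3/EventBridge-style notification.
            print("received message:", body)
            # Delete only after successful processing so failures are retried.
            sqs.delete_message(
                QueueUrl=QUEUE_URL,
                ReceiptHandle=message["ReceiptHandle"],
            )

    if __name__ == "__main__":
        poll_once()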

 

Nice to Have:

  • AWS Certification (AWS Well-Architected, AWS Solutions Architect, AWS Data Analytics, AWS DevOps).
  • Experience working with modern data lakehouse patterns and optimized data formats (Parquet, Avro, etc.).
  • Familiarity with Docker, ECS, or EKS for containerized deployments.
  • Exposure to monitoring and logging tools (e.g., CloudWatch, Prometheus, Grafana).

 

 
