
QE (Scala, Kafka, NiFi, JSON)

India - Pune

 

Job Title: Data QA Engineer (Kafka, NiFi, Scala Ecosystem)

About Us

Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients in the banking, financial services, and energy sectors, and we are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?

You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry, on projects that will transform the financial services industry.

MAKE AN IMPACT

Innovative thinking, delivery excellence and thought leadership to help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK

Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT

With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION

We believe that diversity of people and perspective gives us a competitive advantage.

 

 
Experience: 4–7 Years
Location: Pune (Hybrid – Client Office)
 
Job Overview
 
We are looking for a Data QA Engineer with strong experience in data pipeline and streaming validation across technologies like Apache Kafka, Apache NiFi, and Scala-based systems. The role requires deep expertise in data testing, ETL validation, and real-time data processing, ensuring high data quality, integrity, and reliability across complex distributed systems.
 
Key Responsibilities

- Perform end-to-end data testing across batch and real-time data pipelines
- Validate data ingestion, transformation, and streaming workflows built on Kafka and NiFi
- Verify JSON payloads, event streams, and message integrity across producers and consumers
- Design and execute test cases covering data accuracy, completeness, reconciliation, edge cases, and failure scenarios
- Perform data validation using complex SQL queries across large datasets and data warehouses
- Validate Scala-based backend processing logic impacting data transformations
- Monitor and test Kafka topics, partitions, offsets, and message flows
- Ensure data consistency across upstream and downstream systems
- Identify, log, and track data quality issues and pipeline defects to closure
- Collaborate with data engineers, developers, and business stakeholders for requirement validation and issue resolution
- Participate actively in Agile ceremonies and provide timely, accurate status updates

Required Skills & Qualifications

- 4–8 years of experience in data testing / ETL / data QA roles
- Strong hands-on experience with Apache Kafka (event validation, topic monitoring, consumers/producers)
- Experience in Apache NiFi pipeline testing and data flow validation
- Proficiency in SQL for data analysis and validation
- Experience working with JSON payloads and event-driven architectures
- Understanding of Scala-based data processing systems (basic to intermediate level)
- Strong knowledge of data warehousing and ETL concepts
- Experience in testing real-time/streaming data pipelines
- Familiarity with Agile/Scrum practices

Key Expectations

- Strong ownership and accountability in delivering data validation tasks within timelines
- Consistent participation in daily stand-ups and effective communication of progress and risks
- Ability to build deep functional and data understanding of systems under test
- Focus on edge cases, data anomalies, and negative scenarios to ensure robust validation
- Maintain discipline in status reporting, coordination, and collaboration

Good to Have

- Experience with big data technologies (Spark, Hadoop ecosystem)
- Exposure to data quality frameworks and automation tools
- Knowledge of CI/CD pipelines and DevOps practices
- Domain experience in Banking / Financial Services (BFSI)
- Familiarity with cloud platforms (AWS/Azure/GCP)

Key Competencies

- Strong analytical and data validation skills
- Attention to detail in high-volume data environments
- Proactive communication and collaboration
- Ability to work in fast-paced, distributed systems
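The JSON payload validation described above can be sketched as a small helper of the kind a candidate might write against consumed Kafka messages. This is a minimal illustration only: the field names, required keys, and expected types are assumptions for the sketch, not an actual client schema.

```python
import json

# Hypothetical required schema for one event payload; field names and
# types here are illustrative assumptions, not a real client contract.
REQUIRED_FIELDS = {"event_id": str, "event_time": str, "amount": float}

def validate_payload(raw: str) -> list[str]:
    """Return a list of data-quality issues found in one message payload."""
    issues = []
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError as exc:
        # Malformed JSON is reported as a single issue; no further checks run.
        return [f"malformed JSON: {exc.msg}"]
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in payload:
            issues.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected_type):
            issues.append(f"wrong type for {field}: {type(payload[field]).__name__}")
    return issues

# A valid message yields no issues; an incomplete one is reported.
print(validate_payload('{"event_id": "e1", "event_time": "2024-01-01T00:00:00Z", "amount": 10.5}'))
print(validate_payload('{"event_id": "e1"}'))
```

In practice a check like this would run inside an automated consumer test against real Kafka topics; the sketch isolates only the per-message validation logic.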
