Software Development Engineer II - SAP BODS
Job Description:
We are looking for a skilled Data Integration Engineer with strong hands-on experience in SAP BODS and a working knowledge of Azure Databricks to join our data engineering team. The role requires a solid understanding of ETL development, SAP data extraction, and cloud-based data processing frameworks.
Key Responsibilities:
Design, develop, and maintain ETL pipelines using SAP BODS for data extraction from SAP and non-SAP systems
Integrate SAP data into Azure Data Lake / Azure Databricks, ensuring performance and scalability
Work on batch and incremental loads from SAP using best practices (e.g., CDC, delta loads)
Collaborate with cloud engineers and data analysts to design end-to-end data workflows
Optimize performance and maintain data quality during extraction and transformation
Implement logging, monitoring, and error handling mechanisms
Support UAT and production deployments, and document processes
Required Skills:
Strong hands-on experience in SAP BODS (BusinessObjects Data Services) – job design, dataflows, and transforms
Good understanding of SAP ECC / HANA data structures and experience in building extractors
Working experience with Azure Databricks, including Spark SQL and Delta Lake
Experience in data integration scenarios involving cloud storage (ADLS Gen2, Blob)
Good knowledge of SQL, data modeling, and ETL performance tuning
Familiarity with Azure Data Factory or other orchestration tools is a plus
Understanding of data quality and governance concepts
Nice-to-Have:
Experience working with Delta Live Tables and Unity Catalog
Knowledge of PySpark or Python scripting
Familiarity with CI/CD pipelines for data engineering
Interested in building your career at Sigmoid (India)? Get future opportunities sent straight to your email.