Scala Developer
Bengaluru
Job Summary
We are seeking a skilled Data Engineer with strong Scala development experience to design, build, and optimize scalable data pipelines and cloud data warehouse solutions. The ideal candidate will have hands-on experience with Scala, Python, and SQL, as well as a modern cloud data stack including Snowflake, AWS, Spark, and Airflow.
Key Responsibilities
- Design, develop, and maintain robust and scalable ETL/ELT data pipelines using Scala, Spark, and Python.
- Develop and optimize SQL queries and scripts for efficient data extraction and transformation.
- Implement dimensional modeling and database design including ER diagrams, DDL, and DML operations.
- Work with cloud data warehouse platforms, primarily Snowflake, leveraging best practices for data storage and retrieval.
- Orchestrate workflows with Apache Airflow and build data transformations with dbt.
- Work within AWS cloud and Azure DevOps environments.
- Implement and support data lake solutions and open table formats, including Apache Iceberg.
- Collaborate with data scientists, analysts, and other engineers to ensure data integrity, availability, and performance.
- Participate in code reviews, debugging, and deployment to maintain high-quality code standards.
Must-Have Skills & Experience
- Total professional experience: 3+ years in data engineering or related roles.
- Scala programming: minimum of 1.5 years of hands-on experience.
- Python programming: at least 2 years of experience.
- SQL programming: minimum 1 year of experience.
- Strong understanding of dimensional modeling, ER modeling, and executing DDL/DML operations.
- Experience working with Snowflake cloud data warehouse.
- Proficiency with cloud platforms, especially AWS, and working familiarity with Azure DevOps.
- Hands-on experience with Apache Spark and the Apache Iceberg table format.
- Experience using dbt for data transformations.
- Workflow orchestration using Apache Airflow.
Good to Have
- Experience with PySpark.
- Knowledge of Redshift or other cloud data warehouses.
- Familiarity with big data ecosystem tools such as Hadoop and Kafka.
Create a Job Alert
Interested in building your career at Sigmoid (India)? Get future opportunities sent straight to your email.