Data Architect
Position Overview:
We are looking for a hands-on Principal Data Architect for the challenging and fun-filled work of building a future-proof data architecture for one of our customers in the financial domain.
Roles and Responsibilities:
- Own the design and maintenance of all aspects of data solutions, including modeling, development, technical documentation, data diagrams, and data dictionaries.
- Provide expertise in the development of standards, architectural governance, design patterns, and practices, and evaluate the most applicable solutions for different use cases.
- Determine and develop architectural approaches and solutions, conduct business reviews, document current systems, and develop recommendations.
- Lead the data strategy and own the vision and roadmap for data products.
- Work with stakeholders to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning.
- Develop, maintain, and optimize data infrastructure using Delta Lake, MLflow, and Databricks SQL to enhance data management, processing, and analytics.
- Utilize Snowflake’s features such as data sharing, zero-copy cloning, and automatic scaling to optimize data storage, accessibility, and performance. Ensure effective management of both semi-structured and structured data within Snowflake’s architecture.
- Implement and manage data storage solutions using Amazon S3, and perform data warehousing with Amazon Redshift.
- Design and implement data integration workflows to orchestrate and automate data movement and transformation.
- Apply familiarity with Azure Synapse Analytics, Azure Data Lake Storage, and Azure Data Factory as needed.
- Design and implement scalable data pipelines using tools like Apache Kafka or Apache Airflow to facilitate real-time data processing and batch data workflows.
- Apply advanced analytics techniques, including predictive modeling and data mining, to uncover insights and drive data-driven decision-making.
Must-Have Skills:
- Overall experience of 15+ years.
- 10+ years of experience in data architecture, data platforms, and data warehousing.
- Hands-on experience with Snowflake (4+ years) and Databricks (4+ years), with 5+ years of combined experience across the two.
- Proficiency in Databricks features such as Delta Lake, MLflow, and Databricks SQL, including experience managing Spark clusters and implementing machine learning workflows.
- Solid experience with emerging and traditional data stack components, such as batch and real-time data ingestion, ETL, ELT, orchestration tools, on-premises and cloud data warehouses, Python, and structured, semi-structured, and unstructured databases.
- Knowledge of Snowflake features such as data sharing, zero-copy cloning, and automatic scaling, and experience working with Snowflake’s architecture for semi-structured and structured data.
- Experience with data services offered by Microsoft Azure Cloud, such as Azure Data Lake, Azure Data Factory, etc.
- Proficiency in tools such as Apache NiFi, Talend, Informatica, or Microsoft SQL Server Integration Services (SSIS).
- Experience in designing and implementing data pipelines using tools like Apache Kafka or Apache Airflow.
- Ability to perform data profiling, data quality assessments, and performance tuning.
- Experience in comparing and evaluating different data technologies based on criteria like performance, scalability, and cost.
- Skills in applying advanced analytics techniques, including predictive modeling and data mining.
- Expertise in industry-standard data practices, data strategies, and data concepts.
- Demonstrated experience in architecting/re-architecting complex data systems and data models.
- Demonstrated experience in overall system design, including database selection and solutioning.
Nice-to-Have Skills:
- Experience with data governance tools such as Collibra or Alation.
- Knowledge of data quality frameworks and standards, such as Data Quality Dimensions (completeness, consistency, etc.).
- Familiarity with tools like Apache Beam or Luigi for managing complex data workflows.
- Awareness of emerging data technologies such as data mesh, data fabric, and real-time data processing frameworks.
- Any experience with AI/ML algorithms is a plus for this role.
Qualifications:
- Bachelor’s, Master’s, or equivalent degree in Computer Science or a related field.
Location:
- Melbourne, Australia.
About Nomiso:
Nomiso is a product and services engineering company. We are a team of Software Engineers, Architects, Managers, and Cloud Experts with expertise in Technology and Delivery Management.
Our mission is to Empower and Enhance the lives of our customers through efficient solutions for their complex business problems.
At Nomiso we encourage an entrepreneurial spirit: the drive to learn, grow, and improve. A great workplace thrives on ideas and opportunities; that is part of our DNA. We are in pursuit of colleagues who share similar passions, are nimble, and thrive when challenged. We offer a positive, stimulating, and fun environment with opportunities to grow, a fast-paced approach to innovation, and a place where your views are valued and encouraged.
We invite you to push your boundaries and join us in fulfilling your career aspirations!
We are an equal opportunity employer and are committed to diversity, equity, and inclusion. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, disability status, or any other protected characteristic.