
Senior DataOps Engineer

Melbourne, Australia

Are you a passionate and ambitious Senior DataOps Engineer ready to dive into an environment that fosters innovation, continuous learning, and professional growth? We're seeking talented individuals who are eager to tackle complex big data problems, build scalable solutions, and collaborate with some of the finest engineers in the entertainment industry.

  • Complex Projects, Creative Solutions: Dive into intricate projects that challenge and push boundaries. Solve complex technical puzzles and craft scalable solutions.
  • Accelerate Your Growth: Access mentorship, training, and hands-on experiences to level up your skills. Learn from industry experts and gain expertise in scaling software.
  • Collaborate with Industry Leaders: Work alongside exceptional engineers, exchanging ideas and driving innovation forward through collaboration.
  • Caring Culture, Career Development: We deeply care about your career. Our culture prioritizes your growth with tailored learning programs and mentorship.
  • Embrace Challenges, Celebrate Success: Take on challenges, learn from failures, and celebrate achievements together.
  • Shape the Future: Your contributions will shape the future of entertainment.

About the team

You’ll be joining the Data Engineering team on a mission to maintain a modern, cloud-native data platform that supports business-critical insights and operations. From managing our central data lake to orchestrating real-time pipelines, our team ensures that data flows securely, reliably, and at scale. The DataOps function plays a critical role in this mission, focusing on the overall process and workflows around the data infrastructure, enhancing continuous monitoring and visibility for data, improving system observability, automating governance, and building the foundation for reproducible, scalable data infrastructure.

Key Responsibilities

  • Establish and standardise observability frameworks for data pipelines, ensuring monitoring of system health, data freshness, SLAs, and automated alerting.
  • Proactively monitor and troubleshoot pipeline performance issues, including failures, latency, and reliability bottlenecks, with robust logging, retry logic, and recovery mechanisms in place.
  • Implement and automate data quality validation, including data checks, testing, and lineage tracking, to ensure trust, reliability, and schema consistency across data products.
  • Collaborate with data engineers and domain teams to define and enforce data governance practices, including data SLAs, contracts, access control, encryption, and PII handling.
  • Maintain and evolve CI/CD pipelines and Infrastructure-as-Code (IaC) to ensure reproducibility, scalability, and safe deployment of platform components across environments.
  • Lead incident response processes, including root cause analysis and postmortems, to drive continuous reliability improvements.
  • Own release management workflows for data platform code, including branching strategies, review processes, and approvals.
  • Build and maintain a central data catalog that supports data discovery, trust, and documentation for technical users, enabling self-service access to high-quality data products.
  • Develop reusable infrastructure patterns and standards that empower teams to independently build, deploy, and manage data pipelines and products with consistency and reliability.

Qualifications

  • Bachelor’s degree in Computer Science, Software Engineering, or equivalent practical experience.
  • 4+ years of experience in DataOps, DevOps, or Data Engineering roles with a focus on observability and reliability.
  • Proficiency in Python (or other scripting languages) and strong SQL skills for data transformation and validation.
  • Experience building and managing data infrastructure in AWS, with a solid understanding of CI/CD pipelines and version control.
  • Familiarity with orchestration tools such as Airflow, Dagster, or Prefect.
  • Experience with observability, alerting, and incident response for production data systems.
  • Experience with real-time data streaming tools and patterns (e.g., Kafka, Kinesis).
  • Understanding of compliance requirements in regulated environments (e.g., GDPR, ISO, GLI).
  • Strong collaboration and communication skills.
  • A mindset focused on automation, continuous improvement, and self-service enablement.

Some of the perks of joining us:

  • Championing Engineering Excellence to drive data-driven impact across global-scale software products.
  • Work alongside the top 5% of engineering talent in Australia using a vast AWS cloud-native and big data technology stack.
  • Exposure to building global, large-scale data pipelines, data warehouses, and data lakes that serve thousands of requests per second.
  • Access to over 9,000 courses across our Learning and Development Platform.
  • EAP access for you and your family.
  • Be rewarded with lucrative annual bonuses.
  • Give back with a paid volunteer day.
  • Fuel your day with daily breakfast and open pantries brimming with unlimited snacks and refreshments, all on the house.
  • Break up the week with on-site remedial massages on Wednesdays.
  • In-house full-time baristas serving your daily coffee needs.
  • Weekly team lunches and happy hour in the office from 4pm on Fridays.
  • Enjoy a bustling office with the option for up to 2 days work from home per week.
  • Fun office environment with F1 simulators, table tennis and all your favourite gaming consoles.

We believe that the unique contributions of everyone at Easygo are the driver of our success. To make sure that our products and culture continue to incorporate everyone's perspectives and experience we never discriminate on the basis of race, religion, national origin, gender identity or expression, sexual orientation, age, or marital, veteran, or disability status. We are passionate about providing a workplace that encourages great participation and an equal playing field, where merit and accomplishment are the only criteria for success.

Please note: we allow for flexible start and finish times