Senior Data Operations Engineer 

Austin, TX (Onsite 4 days per week)

Acrisure Innovation is where human expertise meets advanced technology.

Acrisure Innovation is a fast-paced, AI-driven team building innovative software to disrupt the $6T+ insurance industry. Our mission is to help the world share its risk more intelligently to power a more vibrant economy. To do this, we are transforming insurance distribution and underwriting into a science.

At the core of our operating model is our technology: we’re building a digital marketplace for risk and applying it at the center of Acrisure, a privately held company recognized as one of the world's top 10 insurance brokerages and the fastest-growing insurance brokerage globally. By leveraging technology to push the boundaries of understanding and transferring risk, we are systematically converting data into predictions, insights, and choices. We believe this can remove the constraints of scale, scope, and learning that have existed in the insurance industry for centuries.

Our culture is strong. We are a collaborative company of entrepreneurial, innovative, and talented people who believe in our future. We outthink and outwork the competition. We look outside our walls and are energized by our fast-paced trajectory.

Our vision for the future is clear: we have limitless potential to achieve unprecedented success in the insurance industry, and seizing that opportunity requires a best-in-class team.

Learn more about Acrisure Innovation: https://builtin.com/company/acrisure-innovation

The Innovation team’s mission is to unify data across the enterprise to optimize business decisions at the strategic, tactical, and operational levels of the organization. We accomplish this by building a data lakehouse that powers analytics and reporting platforms, along with business processes that deliver quality data from any channel of the company in a timely fashion and present it in a way that maximizes its value for both internal and external customers.

We’re seeking a skilled Senior Data Operations Engineer to join our Data & Technology team. This person will be instrumental in monitoring and supporting our modern data integration pipelines, primarily built on Azure Data Factory (ADF), Databricks, Power BI, and SQL Server. While legacy SSIS is still in place, our focus is on modernizing the stack.

The ideal candidate will proactively troubleshoot ETL jobs, manage data pipelines, help land external data into the Enterprise Data Lakehouse, and ensure seamless performance of integrations. This is a cross-functional role, working closely with Data Engineering, Data Intelligence, and Analytics teams. 

Responsibilities: 

  • Onboard and curate data sources, including data preparation/ELT and modeling, to enable data consumption by analytics and AI teams.
  • Act as a solution architect and technology leader, making decisions in the face of ambiguity and solving difficult technical problems.
  • Onboard, lead, and mentor junior and mid-level developers.
  • Work closely with data source contacts and the Analyst, Product, and Data Intelligence teams to identify opportunities and assess improvements to our products and services.
  • Contribute to workshops with the business user community to further their knowledge and use of the data ecosystem.
  • Produce and maintain accurate project documentation, project plans, and presentations.
  • Collaborate with various data providers to resolve dashboard, reporting, and data-related issues.
  • Perform data benchmarking, enhancements, optimizations, and platform analytics.
  • Participate in the research, development, and adoption of trends in technology, data, and analytics.
  • Monitor and troubleshoot nightly ETL pipelines, primarily built in Databricks, Azure Data Factory (ADF), and SQL Server, to ensure reliable data flow and minimal downtime.
  • Resolve data issues, collaborating with BI Analysts and Engineering teams to address data latency, quality, and integration problems quickly and effectively.
  • Implement and maintain monitoring and alerting systems using Azure Monitor, Log Analytics, and custom dashboards to proactively catch and address failures. 
  • Administer and optimize SQL Server databases, including indexing, performance tuning, backups, Always On availability groups, transactional replication, and resource management, to support fast and efficient queries.
  • Ingest and land external data securely using replication, SFTP, or Azure Private Endpoints, ensuring readiness for EDW integration. 
  • Develop automation scripts to reduce manual work in data validation, pipeline checks, and recovery workflows (see the sketch after this list).
  • Manage secret rotation and credential handling securely through Azure Key Vault, ensuring SOC 2 compliance.
  • Contribute to modernization efforts, migrating legacy SSIS workflows to scalable ADF and Databricks platforms. 
  • Optimize performance and scalability of workflows in Databricks notebooks and SQL Server stored procedures, addressing bottlenecks and improving execution speed. 
  • Participate in root cause analysis and drive incident resolution and continuous improvements through post-incident reviews. 
  • Support data governance and compliance, working with security teams on access controls, auditing, and policy enforcement. 
  • Document operational procedures and maintain runbooks, playbooks, and support guides to enable consistency, onboarding, and knowledge transfer.
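
As a concrete illustration of the pipeline-check and recovery automation described above, here is a minimal Python sketch using the azure-identity and azure-mgmt-datafactory SDKs: it queries ADF for runs that failed in the last 24 hours and requeues them. The subscription, resource group, factory, and parameter names are hypothetical placeholders, not a description of Acrisure's actual tooling.

    from datetime import datetime, timedelta, timezone

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        RunFilterParameters, RunQueryFilter,
        RunQueryFilterOperand, RunQueryFilterOperator,
    )

    SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
    RESOURCE_GROUP = "rg-data-platform"                       # placeholder
    FACTORY_NAME = "adf-enterprise"                           # placeholder

    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Find pipeline runs that failed in the last 24 hours.
    now = datetime.now(timezone.utc)
    failed = client.pipeline_runs.query_by_factory(
        RESOURCE_GROUP,
        FACTORY_NAME,
        RunFilterParameters(
            last_updated_after=now - timedelta(hours=24),
            last_updated_before=now,
            filters=[RunQueryFilter(
                operand=RunQueryFilterOperand.STATUS,
                operator=RunQueryFilterOperator.EQUALS,
                values=["Failed"],
            )],
        ),
    )

    for run in failed.value:
        print(f"{run.pipeline_name} run {run.run_id} ended as {run.status}")
        # Recovery step: requeue the pipeline, passing parameters through to ADF.
        rerun = client.pipelines.create_run(
            RESOURCE_GROUP,
            FACTORY_NAME,
            run.pipeline_name,
            parameters={"reprocessDate": now.strftime("%Y-%m-%d")},  # hypothetical parameter
        )
        print(f"  requeued as run {rerun.run_id}")

In practice a script like this would run on a schedule (or fire from an Azure Monitor alert) rather than rerunning failures blindly, but it shows the shape of the validation and recovery work the role involves.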

 

Qualifications & Requirements: 

Bachelor's degree preferred, or equivalent experience, along with a demonstrated desire for continuing education and improvement.

Experience: 

  • 8-10 years of hands-on experience working with Azure cloud services, including Data Factory, Databricks, Power BI, and SQL Server.
  • Strong hands-on experience with data lake and Delta Lake concepts.
  • Strong hands-on experience with Databricks Unity Catalog, including its use with Delta tables.
  • Ability to analyze, summarize, and characterize large or small data sets of varying fidelity or quality, and to identify and explain insights or patterns within them.
  • Experience with multi-source data warehouses.
  • Experience with other cloud environments (e.g., GCP) is a definite plus.
  • Experience in data analytics and reporting, particularly with Power BI, is a plus.
  • Hands-on experience building logical and physical data models.
  • Write SQL fluently, recognize and correct inefficient or error-prone SQL, and perform test-driven validation of SQL queries and their results.
  • Create and share standards, best practices, documentation, and reference examples for data warehouse, integration/ELT, and end-user reporting systems.
  • Apply a disciplined approach to testing software and data, identifying data anomalies, and correcting both data errors and their root causes.
  • Well versed in Azure Key Vault, including creating, maintaining, and using secrets in both Databricks and ADF (see the sketch after this list).
  • Knowledgeable in stored procedures and functions, and able to invoke them from ADF and Databricks, as this is a widely used practice at Acrisure.
  • Familiar with DevOps processes for Azure artifacts and database artifacts.
  • Well versed in ADF concepts such as chaining pipelines, passing parameters, and using the ADF and Databricks APIs to perform various activities.
  • Well versed in Agile and Scrum principles and procedures, and comfortable working with Jira.
  • Proven ability to build, manage, and troubleshoot data pipelines in Databricks and ADF, ensuring reliability and performance. 
  • Deep familiarity with SQL Server administration—from tuning queries and managing indexes to backups and performance optimization. 
  • Strong knowledge of SQL Server security best practices, including patching, maintenance plans, and access control. 
  • Willingness to participate in an on-call rotation for Sev 0 alerts, including occasional weekend support as part of a shared schedule. 
  • Solid understanding of monitoring and observability tools like Azure Monitor, Log Analytics, and Key Vault for secure and visible operations. 
  • Comfortable managing secret rotation and credential security in cloud environments, with a strong awareness of data security best practices. 
  • Versatile in working with various data formats and sources—structured and unstructured—including CSV, JSON, XML, binary column-based formats (Parquet, Arrow), databases, and Azure storage. 
  • Strong grasp of relational and dimensional data modeling, with experience translating models into efficient, scalable systems. 
  • Skilled in debugging, tuning, and enhancing workflows across Databricks notebooks and SQL-based transformations. 
  • Detail-oriented when it comes to documentation, producing clear, actionable runbooks and technical guides. 
  • Collaborative and communicative across functions, able to explain technical issues clearly and work effectively in cross-functional teams. 
  • Adaptable in fast-paced environments, with a proactive approach to ownership and accountability. 
  • Bonus: experience with IaC (Terraform), Azure Web Applications, or dbt, or involvement in data governance and compliance initiatives.
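
The Key Vault item above typically looks like the following in practice: a minimal Databricks notebook sketch (where spark and dbutils are predefined) that pulls SQL Server credentials from a Key Vault-backed secret scope and uses them for a JDBC read into a Delta table. The scope, secret, server, and table names are hypothetical placeholders.

    # Retrieve credentials from a Key Vault-backed secret scope; Databricks
    # redacts secret values from notebook output and logs.
    sql_user = dbutils.secrets.get(scope="kv-data-platform", key="sqlserver-user")
    sql_password = dbutils.secrets.get(scope="kv-data-platform", key="sqlserver-password")

    jdbc_url = "jdbc:sqlserver://sql-enterprise.database.windows.net:1433;database=edw"

    # Load a SQL Server table over JDBC and persist it as a Delta table.
    df = (
        spark.read.format("jdbc")
        .option("url", jdbc_url)
        .option("dbtable", "dbo.PolicySnapshot")  # hypothetical source table
        .option("user", sql_user)
        .option("password", sql_password)
        .load()
    )
    df.write.format("delta").mode("overwrite").saveAsTable("bronze.policy_snapshot")

The same secrets can be referenced from ADF linked services via Key Vault, so rotating a credential in one place updates both tools.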

 

Additional Qualifications:

  • Excellent organizational skills with the ability to prioritize, communicate and execute tasks across multiple projects with tight deadlines and aggressive goals. 
  • Expert working knowledge of SQL, Python, Scala, and Spark, and demonstrated ability to create ad-hoc SQL queries to analyze data, create prototypes, etc.
  • Ability to understand complex issues and clearly articulate complex ideas. 
  • Demonstrated ability to champion change, influence, and drive results in a complex organization. 
  • Ability to mentor junior developers and contribute to the growth of the team. 
  • Excellent verbal and written communication skills. 
  • Experience working in a multi-cloud environment is a plus.
  • Experience working with dbt is a plus.
  • Knowledge of the insurance industry or FinTech is a plus.


