
Data Engineer

Carmel, Indiana

Summary

We are seeking a detail-oriented, proactive data engineer to support evolving data architecture and pipeline development initiatives across the organization. This role is responsible for designing, building, and maintaining scalable, reliable data pipelines, transformations, and integration workflows that deliver timely, accurate data to internal and client-facing applications.


ESSENTIAL FUNCTIONS

  • Develop, maintain, and monitor ETL/ELT pipelines from multiple internal and external data sources, including carriers and third-party providers.
  • Design and implement data solutions following Medallion Architecture principles (Bronze, Silver, and Gold layers) to enable robust, layered data pipelines.
  • Manage and enforce Conformed Dimensions across data marts and warehouses to ensure consistency and accuracy in reporting.
  • Optimize SQL queries and data normalization routines to ensure efficient storage and retrieval.
  • Utilize Liquibase for database change management and version control to ensure reliable and repeatable database deployments.
  • Build and maintain data transformation workflows using dbt (data build tool) to improve modularity, testing, and documentation of data models.
  • Collaborate with data scientists, analysts, and developers to translate business requirements into scalable data solutions.
  • Automate data ingestion, transformation, and validation processes to ensure data quality and integrity.
  • Manage data lakes, data marts, and warehouse environments to support analytics and reporting needs.
  • Build and maintain APIs and data connectors for seamless integration between systems.
  • Implement and enforce data governance and security best practices to maintain compliance and data accuracy.
  • Identify and recommend process improvements and automation opportunities within data workflows.
  • Ensure the integrity and confidentiality of all data processed in compliance with company policies and regulations.


REQUIRED SKILLS

  • Strong proficiency in SQL and experience with ETL/ELT tools and data pipeline frameworks.
  • Hands-on experience with Liquibase for database schema change management.
  • Experience using dbt for data transformation and pipeline orchestration.
  • Proficient in Python for scripting, automation, and data manipulation.
  • Understanding and practical experience with Medallion Architecture for data lakehouse design.
  • Knowledge of Conformed Dimensions for consistent dimensional modeling across data assets.
  • Familiarity with cloud data platforms (e.g., AWS, Azure, or Google Cloud) preferred.
  • Working knowledge of BI visualization tools (e.g., Power BI, Tableau) to support data consumption.
  • Understanding of API usage and integration methods.
  • Problem solver with strong logical and analytical skills.
  • Effective communicator with the ability to collaborate across technical and business teams.


REQUIRED EDUCATION/CERTIFICATION

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field (Master’s preferred), or equivalent experience.
  • Experience in insurance, health, or risk management industries a plus.

JOB LOCATION

  • Carmel, Indiana
