Data Engineer
Ensemble Travel Group, established in 1968, is a leading consortium of top-tier travel agencies throughout the U.S. and Canada. Part of Kensington Tours since June 2022, Ensemble provides members with access to exclusive offers, unique hosted tours, partnerships and superior marketing opportunities with best-in-class suppliers, and proprietary travel platforms such as ADX, which offers agents instant commission visibility, one-click insurance, and more to improve efficiency and earnings. Ensemble maintains offices in Toronto and New York.
Job Description
As the Data Engineer, you will play a central role in building, maintaining, and optimizing Ensemble’s ingestion pipelines and operational data workflows. You will ensure that data flowing from partner systems, internal applications, CRM, and finance platforms is accurate, timely, and ready for reporting and downstream processes.
You will work closely with the Director of Data & Platform Operations to implement scalable ingestion frameworks, improve data quality, support AI-driven automation initiatives, and strengthen the operational stability of Ensemble’s core data ecosystem. This role requires strong hands-on technical skills, attention to detail, and a deep commitment to reliability in a high-volume data environment.
Key Responsibilities
Ingestion Pipeline Development & Maintenance
- Design, build, and automate ingestion pipelines using Python, PySpark, Microsoft Fabric, Azure Data Factory, Synapse, and related technologies.
- Own the operational reliability of partner production data, profit-sharing workflows, CRM/finance data feeds, and ADX/HubSpot/GRASP integrations.
- Implement structured logging, monitoring, alerting, and automated validation for ingestion workflows.
- Develop reusable ingestion patterns that support standardization, performance, and scalability.
Data Quality, Transformation & Governance
- Partner with the data team to validate ingestion outputs, resolve discrepancies, and maintain high data quality across all pipelines.
- Create and maintain transformation logic, mapping tables, data models, and reference datasets used by reporting and analytics teams.
- Contribute to governance standards, documentation, and repeatable operational procedures.
Automation & AI-Enabled Processes
- Identify opportunities to automate manual ingestion, validation, and correction tasks using Python, Power Automate, Fabric tools, or AI-driven approaches.
- Work with the Director to implement automated detection, remediation, and process optimization for recurring ingestion challenges.
- Ensure automated workflows are stable, secure, and documented.
Operational Support & Documentation
- Support CRM and finance data flows by developing and maintaining reliable integrations and transformation logic.
- Troubleshoot issues across data pipelines, internal systems, APIs, and partner data sources.
- Collaborate with Platform Engineering to ensure cloud environments and internal tools support data operations effectively.
- Documentation & Process Discipline
- Document ingestion processes, architecture diagrams, schema rules, and operational runbooks.
- Ensure handoffs, workflows, and edge cases are clearly recorded to support continuity and cross-training.
Qualifications
- Strong hands-on experience with Python, PySpark, and SQL for ingestion and transformation.
- Experience with Microsoft Fabric (Lakehouse, Pipelines), Azure Data Factory, Synapse Analytics, or similar cloud-scale ETL tools.
- Familiarity with data architecture concepts, dimensional modeling, and best practices for operational data pipelines.
- Experience with APIs (REST), JSON, XML, and integration patterns across CRM and finance systems.
- Understanding of orchestration, workflow automation, and monitoring tools.
- Exposure to AI/ML-enabled automation or validation tools is an asset.
- 3–5 years of experience in data engineering or ETL development, preferably in an operational or high-volume environment.
- Experience supporting or integrating CRM systems (HubSpot preferred) and/or finance data workflows.
- Demonstrated ability to troubleshoot ingestion issues end-to-end across cloud, application, and data layers.
- Strong analytical and problem-solving skills with a focus on reliability and data correctness.
Cultural Fit
- Highly detail-oriented, structured, and disciplined in operational processes.
- Comfortable working in a fast-paced environment with evolving business needs.
- Low ego, collaborative, and proactive in communicating risks and solutions.
- Excited about building modern data systems and contributing to automation and AI-driven improvements.
We know that our success depends on the people who join our team, which is why we recruit the best. Our team is made up of owners: people who are smart, low-ego, and accountable for their results. We all play a part in the success of the company and are proud of what we do.
We are committed to providing employment accommodation in accordance with the Ontario Human Rights Code and the Accessibility for Ontarians with Disabilities Act. If you require accommodation due to a disability at any stage of our hiring process, please advise us when completing your application.
We thank all candidates for their interest; however, only those selected for an interview will be contacted.