
LVS

Data Engineer (12 month FTC - Project running until 2028)

Remote (UK Only)

Hiring Manager: Tamara Carpintero

The Opportunity

Are you ready to make a significant impact on the future of data? We have an exciting opportunity for a Data Engineer to join our Data Engineering team - ideal for experienced professionals who are ready to lead, innovate, and energise those around them. As a 100% data-driven company, we pride ourselves on employing the best engineering practices across our products and solutions. This role offers a broad and varied experience within our Data Engineering function, empowering you to create data-as-product solutions that drive real change.

In this role, you will:

  • Assemble Large, Complex Data Sets: Craft and manage data sets that meet both functional and non-functional business requirements.
  • Build Advanced Data Solutions: Develop the software and infrastructure for optimal data extraction, transformation, migration and ingestion using leading cloud technologies like Azure.
  • Leverage Big Data Technologies: Utilize tools such as FME, Hadoop, Spark, and Kafka to design and manage large-scale data processing systems.
  • Extend & Maintain Data Warehousing: Support and enhance the ETL, Data Lakes, Data Processing and Data Warehouse solutions crucial for Landmark's customers and reporting solutions.
  • Ensure Cost Efficiency: Keep our data tools and solutions within agreed cost models and budgets.
  • Lead by Example: Provide thought leadership, inspiring and guiding the team towards excellence.

About You

We're looking for a passionate and motivated Data Engineer — someone who thrives on solving complex problems, inspires others, and brings a positive, proactive mindset to the team.

The ideal candidate will possess:

  • Extensive experience in data engineering across both cloud and on-prem environments
  • Expert knowledge of data technologies, data transformation tools, and data governance techniques
  • Exceptional coding skills
  • Strong analytical and problem-solving abilities
  • A good understanding of quality and information security principles
  • Hands-on experience with Azure (coding, configuration, automation, monitoring, security)
  • Advanced database and SQL skills
  • Effective communication, with the ability to explain technical concepts to a range of audiences
  • A strong grasp of data model design and implementation principles
  • The ability to coach and support less experienced team members
  • Essential skills:
    • ETL tools such as FME, Azure Data Factory (ADF), etc.
    • Languages such as Python and SQL (nice to have: R, Spark, Java, PySpark, Spark SQL, etc.)
    • Cloud technologies such as Azure Blob Storage, Cosmos DB, RabbitMQ in Azure, ADF, etc.
    • SQL (e.g. Postgres) and data warehousing design patterns and implementation
  • Nice-to-have skills:
    • HDFS, Hadoop, or other on-prem big data technologies
    • Familiarity with Salesforce in data integration scenarios
    • Experience with data lakes
    • Exposure to geospatial data

 
