
Data Engineer

Inizio is the world’s leading healthcare and communications group, providing marketing and medical communications services to healthcare clients. We have five main divisions within the group: Medical, Advisory, Engage, Evoke and Biotech. Our Medical division focuses on communicating evidence on new scientific and drug developments and educating healthcare professionals and payers on the appropriate use of therapy.

 

We have a fantastic opportunity for a Data Engineer to support the build of AI capabilities across Inizio Medical.

 

Key Responsibilities

  • Build scalable and efficient data pipelines.
  • Design the data architecture (including data models, schemas and data pipelines) to process complex data from a variety of data sources.
  • Build and maintain the CI/CD infrastructure to host and run data pipelines.
  • Build and maintain data APIs.
  • Set up, support, interact with and maintain AI components, including generative and machine learning models.
  • Build mechanisms for monitoring data quality and accuracy to ensure the reliability and integrity of data.
  • Evaluate and make technical decisions on the most suitable data technology based on business needs (including security, cost, etc.).
  • Collaborate with data scientists, data analysts, software developers and other stakeholders to understand data requirements.
  • Work closely with system admins and infrastructure teams to effectively integrate data engineering platforms into wider group platforms.
  • Stay abreast of new and emerging technologies related to data engineering and be an active champion of data engineering.
  • Monitor and optimise the performance of data systems, troubleshoot issues, and implement solutions to improve efficiency and reliability.

 

To succeed, you will need:

  • Strong proficiency in Python
  • Experience working with generative AI models, including their deployment and orchestration
  • A solid understanding of database technologies and modelling techniques, including relational and NoSQL databases
  • Experience setting up and managing Databricks environments
  • Competence working with Spark
  • A solid understanding of data warehousing modelling techniques
  • Competence in setting up CI/CD / DevOps pipelines
  • Experience with the cloud platforms Azure and AWS and their associated data technologies is essential
  • Experience with and understanding of graph technologies and modelling techniques is desirable
  • Experience with GCP and Scala is desirable
  • Excellent communication skills, capable of explaining complex data/technical concepts to stakeholders with varying levels of technical awareness
  • Ability to work collaboratively

 

In addition to a great compensation and benefits package, including private medical insurance and a company pension, we are happy to talk dynamic working. We are also known for our friendly and informal working environment and offer excellent opportunities for career and personal development.

 

 

Don't meet every job requirement? That's okay! Our company is dedicated to building a diverse, inclusive, and authentic workplace. If you're excited about this role, but your experience doesn't perfectly fit every qualification, we encourage you to apply anyway. You may be just the right person for this role or others.
