
Senior Data Engineer

Colombia
For more than 20 years, our global network of passionate technologists and pioneering craftspeople has delivered cutting-edge technology and game-changing consulting to companies on the brink of AI-driven digital transformation. Since 2001, we have grown into a full-service digital consulting company with 5500+ professionals working on a worldwide ambition.
Driven by the desire to make a difference, we keep innovating, fueling the growth of our company with our knowledge-worker culture. When teaming up with Xebia, expect in-depth expertise based on an authentic, value-led, and high-quality way of working that inspires all we do.

About the Role
We’re looking for a highly capable AWS Data Engineer to join our growing Data & AI team. You’ll be designing and building robust, scalable data solutions leveraging the AWS cloud ecosystem, with a focus on real-time streaming, event-driven architecture, and microservices. If you thrive in a collaborative environment, have a passion for automation and quality, and believe in clean, testable code, we want to hear from you.

What You’ll Do
Cloud-Native Data Engineering on AWS
  • Design and implement data pipelines using AWS services like Lambda, Kinesis, EventBridge, and DynamoDB.
  • Build scalable APIs and microservices using Python 3.6+, FastAPI, GraphQL, and Pydantic.
  • Work with infrastructure-as-code using AWS CDK to provision cloud resources securely and reproducibly (a brief CDK sketch follows this list).
  • Monitor and troubleshoot cloud-native data applications using CloudWatch and related observability tools.
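As referenced above, here is a minimal AWS CDK (Python) sketch of the kind of pipeline this role builds: a Kinesis stream triggering a Lambda function that writes to DynamoDB. It is illustrative only; the construct names, runtime version, asset path, and table key are placeholder assumptions, not details of an actual project.

    # Illustrative AWS CDK (Python) sketch: Kinesis stream -> Lambda -> DynamoDB.
    # All resource names and settings are placeholders for the example.
    from aws_cdk import (
        Stack,
        aws_dynamodb as dynamodb,
        aws_kinesis as kinesis,
        aws_lambda as _lambda,
    )
    from aws_cdk.aws_lambda_event_sources import KinesisEventSource
    from constructs import Construct

    class IngestionStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)

            # Stream that receives raw events from upstream producers.
            stream = kinesis.Stream(self, "EventsStream")

            # Table that stores processed events, keyed by event_id.
            table = dynamodb.Table(
                self, "EventsTable",
                partition_key=dynamodb.Attribute(
                    name="event_id", type=dynamodb.AttributeType.STRING
                ),
            )

            # Lambda that consumes the stream and writes to the table.
            handler = _lambda.Function(
                self, "EventsHandler",
                runtime=_lambda.Runtime.PYTHON_3_11,
                handler="handler.process",
                code=_lambda.Code.from_asset("lambda"),
                environment={"TABLE_NAME": table.table_name},
            )

            handler.add_event_source(
                KinesisEventSource(
                    stream, starting_position=_lambda.StartingPosition.LATEST
                )
            )
            table.grant_write_data(handler)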
Engineering Practices & Collaboration
  • Apply best practices of the SDLC, including Test-Driven Development (TDD), Continuous Integration (CI), and Continuous Delivery (CD); a short Pytest sketch follows this list.
  • Collaborate across teams using GitLab for version control, pipeline automation, and issue tracking.
  • Participate in design reviews, peer code reviews, and mentor junior engineers.
  • Document architecture decisions, API contracts, and operational procedures.
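As a flavor of the TDD style mentioned above, here is a minimal Pytest sketch around a Pydantic model. The OrderEvent model, the enrich_event function, and their fields are hypothetical examples invented for illustration.

    # Illustrative TDD-style Pytest sketch; model and function are hypothetical.
    import pytest
    from pydantic import BaseModel, ValidationError

    class OrderEvent(BaseModel):
        order_id: str
        amount_cents: int

    def enrich_event(raw: dict) -> OrderEvent:
        # Validate and normalize an incoming event payload.
        return OrderEvent(**raw)

    def test_enrich_event_parses_valid_payload():
        event = enrich_event({"order_id": "o-1", "amount_cents": 1999})
        assert event.amount_cents == 1999

    def test_enrich_event_rejects_missing_fields():
        # A payload without amount_cents should fail validation.
        with pytest.raises(ValidationError):
            enrich_event({"order_id": "o-1"})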

Streaming, Eventing, and Real-Time Processing

  • Develop and support real-time streaming solutions using AWS Kinesis and EventBridge.
  • Implement and manage event-driven data flows and microservices patterns.
  • Integrate streaming data with downstream analytics or storage systems (e.g., S3, DynamoDB); a short handler sketch follows this list.
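The sketch below, which pairs with the CDK example above, shows one way such a handler might look: a Lambda consuming a Kinesis batch and persisting records to DynamoDB via boto3. The TABLE_NAME environment variable, the item shape, and the choice of sequence number as key are assumptions made for the example only.

    # Illustrative Lambda handler for a Kinesis event source; names are assumptions.
    import base64
    import json
    import os

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table(os.environ["TABLE_NAME"])

    def process(event, context):
        # Kinesis delivers base64-encoded record payloads in batches.
        for record in event["Records"]:
            payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
            table.put_item(Item={
                "event_id": record["kinesis"]["sequenceNumber"],
                "body": payload,
            })
        return {"processed": len(event["Records"])}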
 
What You Bring
  • 5+ years of professional experience in Python (v3.6+), including familiarity with Pytest or similar frameworks.
  • Deep understanding of core AWS services: Lambda, S3, DynamoDB, EventBridge, Kinesis, CloudWatch, and CDK.
  • Proven experience designing APIs/microservices with FastAPI, GraphQL, and Pydantic.
  • Strong knowledge of GitLab workflows and CI/CD practices.
  • Solid grasp of data engineering principles, SDLC, and test-driven development.
  • Experience with real-time data and event-driven architecture.
  • Availability to work within the EST time zone.

Nice to have:
  • Knowledge of additional Python tools and libraries for data (e.g., Pandas, boto3)
  • Experience with containerization (Docker) or serverless-first design patterns
  • Exposure to schema evolution, data contracts, or metadata management
 
 

 
