Data Architect (GCP)

Bulgaria, Georgia, Hungary, Lithuania, Poland, Romania

We are seeking an experienced Data Architect with strong expertise in Google Cloud Platform (GCP), Dataplex, and Apache-based data pipelining tools. A career in Exadel means you will work alongside exceptional colleagues and be empowered to reach your professional and personal goals.

Why Join Exadel

We’re an AI-first global tech company with 25+ years of engineering leadership, 2,000+ team members, and 500+ active projects powering Fortune 500 clients, including HBO, Microsoft, Google, and Starbucks.

From AI platforms to digital transformation, we partner with enterprise leaders to build what’s next.

What powers it all? Our people: ambitious, collaborative, and constantly evolving.

About the Client  

Our client is a global management consulting firm based in New York, with 60+ offices worldwide. Their 4,500+ professionals deliver strategic solutions for major players in the retail sector.

What You’ll Do

  • Design, implement, and optimize cloud-native data architectures within GCP
  • Leverage Dataplex for data governance, cataloging, and lifecycle management
  • Build and manage Apache-based data pipelines (Beam, Airflow, Kafka, etc.) to ensure efficient, scalable data processing
  • Develop and maintain ETL/ELT workflows with a focus on cloud-based and streaming architectures
  • Define and enforce data governance and compliance best practices across platforms
  • Collaborate with engineering and analytics teams to ensure data availability, reliability, and performance
  • Provide expertise in big data processing and enterprise-scale analytics solutions on GCP
  • Stay current with emerging data technologies and recommend improvements to existing data architecture

What You Bring

  • 7+ years of experience in data architecture, database design, and data engineering
  • Strong expertise in Google Cloud Platform (GCP), including Dataplex, BigQuery, Dataflow (Apache Beam), and other GCP-native data tools
  • Experience with Apache-based data pipelining tools (e.g., Beam, Airflow, Kafka, Spark) for scalable data processing and orchestration
  • Expertise in data modeling (conceptual, logical, and physical) for structured and semi-structured data
  • Solid knowledge of ETL/ELT processes, data transformation, and integration techniques in cloud environments
  • Strong understanding of data governance, metadata management, and security compliance within GCP
  • Excellent communication and presentation skills — ability to engage with clients, present technical solutions, and translate complex data concepts into clear business insights
  • Proven ability to collaborate with cross-functional teams, including engineers, analysts, and business stakeholders, to ensure data integrity and accessibility

English level

Upper-intermediate

Legal & Hiring Information

  • Exadel is proud to be an Equal Opportunity Employer committed to inclusion across minority, gender identity, sexual orientation, disability, age, and more
  • Reasonable accommodations are available to enable individuals with disabilities to perform essential functions
  • Please note: this job description is not exhaustive. Duties and responsibilities may evolve based on business needs

Exadel Culture

We lead with trust, respect, and purpose. We believe in open dialogue, creative freedom, and mentorship that helps you grow, lead, and make a real difference. Ours is a culture where ideas are challenged, voices are heard, and your impact matters.
