Senior Data Engineer

Remote

Company Overview:

Archera empowers organizations of all sizes to optimize their cloud costs through unique, short-term, insured Guaranteed Reserved Instance (GRI) and Guaranteed Savings Plan (GSP) commitments. Hundreds of customers rely on Archera's innovative FinTech solutions to increase their cloud ROI while accelerating revenue objectives and service agility.

Archera's unique cloud rate insurance products and free FinOps platform enable teams to accurately predict and plan infrastructure growth, maximize commitment-based savings, and hedge against shifting market demands and evolving cloud service provider offerings and incentives. Archera works closely with Amazon, Microsoft, Google, and a broad network of software and services partners to help each customer execute on a bespoke, financially prudent cloud growth and success strategy.

Position Overview:

We’re looking for a Senior Data Engineer to build and scale the data backbone of Archera’s cloud cost optimization products. This role focuses on architecting large-scale data pipelines that process cloud billing and usage data across AWS, Azure, and GCP. You’ll work across OLTP and OLAP systems to ensure our data infrastructure is performant, reliable, and purpose-built for FinOps. Your work will enable accurate cost modeling, insights, and automation that directly power Archera’s cloud rate insurance platform.

Key Responsibilities:

  • Architect, optimize, and maintain large-scale ETL pipelines (10TB+/day) for cloud usage and billing data.
  • Improve performance, testing coverage, and reliability across our data infrastructure stack.
  • Design scalable data models and systems to support new product features and analytics requirements.
  • Debug and resolve high-priority data-related issues across ingestion and transformation workflows.
  • Collaborate cross-functionally to deliver data-driven capabilities that power FinOps outcomes.
  • Integrate and monitor new data sources across Snowflake, Postgres, and BigQuery ecosystems.

Qualifications:

  • 5–10 years of experience in backend or data engineering roles, preferably in cloud or SaaS environments.
  • Expertise in Python and SQL, with hands-on experience building and scaling complex ETL workflows.
  • Deep knowledge of OLTP (e.g., Postgres) and OLAP (e.g., Snowflake, BigQuery) data systems.
  • Proven ability to process and model large datasets (100M–1B+ rows/day).
  • Familiarity with data observability, testing frameworks, and workflow orchestration tools.
  • Bonus: Experience with SQLAlchemy, cloud billing data, or FinOps platforms.

Benefits:

  • Competitive salary with equity plan
  • Opportunities for professional development and advancement within the company
  • Dynamic and collaborative work environment with a focus on innovation and excellence
  • Full medical, dental, and vision plans
  • Unlimited PTO

Location:

Hybrid or remote role, preferably in the Greater Seattle Area. Open to candidates across the U.S. and Canada.
