
📊 Senior Data Engineer – Scalable Pipelines for AI Agent Infrastructure
We’re looking for a Senior Data Engineer to design, build, and maintain the data foundation behind our next-generation AI agent platform. You’ll work closely with AI/ML teams to power training, inference, and continuous learning through highly scalable data pipelines and cloud-native architectures.
If you’re passionate about data infrastructure, performance optimization, and driving data quality at scale, we want to hear from you.
🗓 Start date: ASAP
📆 Contract type: Contractor – Indefinite
🌐 Work hours: Monday to Friday, 7:30 am to 4:30 pm PST – 100% Remote
🛠️ What You’ll Be Doing
- Build scalable ETL/ELT pipelines to support AI agent training and inference.
- Integrate data from diverse sources (APIs, databases, real-time streams).
- Design high-performance data models using PostgreSQL and Snowflake.
- Optimize SQL queries for large datasets and data warehouse workloads.
- Implement data validation, cleansing, and monitoring workflows.
- Automate data engineering tasks and contribute to CI/CD pipelines.
- Collaborate with AI/ML engineers and product teams to align data with business needs.
- Document processes, pipelines, and infrastructure decisions clearly.
✅ What You Need to Succeed
Must-haves
- 5+ years of experience in Data Engineering or Data Infrastructure roles.
- Strong command of:
  - ETL frameworks: Airflow, dbt, Spark
  - Databases: PostgreSQL, Snowflake
  - Cloud & DevOps: AWS (S3, EC2), Linux, Docker
  - CI/CD: GitHub, Jenkins, or similar
  - Programming: Python (data-focused scripting)
- Deep knowledge of SQL performance tuning, data warehousing, and data quality practices.
- Experience with data security and managing large-scale database environments.
- Comfortable working in cross-functional teams and agile environments.
Nice-to-haves
- Experience with AWS RDS, MWAA, Lambda.
- Familiarity with vector databases and retrieval-augmented generation (RAG) pipelines.
- Exposure to prompt engineering, LLM workflows, or AI/ML data operations.
- Data Engineering certification (GCP, AWS, or equivalent).
🧭 Our Recruitment Process
Here’s what to expect from our candidate-friendly interview process:
- Initial Interview – 60 minutes with our Talent Acquisition Specialist
- Culture Fit – 30 minutes with our Team Engagement Manager
- Technical Assessment – Python coding + multiple-choice questions
- Final Stage – 60 minutes with the Hiring Manager (Technical Interview)
🌟 Why Join Launchpad?
We believe that great work starts with great people. At Launchpad, we offer:
- 💻 Fully remote work with hardware provided
- 🌎 Global team experience with clients in [regions]
- 💸 Competitive USD compensation
- 📚 Training and learning stipends
- 🌴 Paid Time Off (vacation, personal, study)
- 🧘‍♂️ A culture that values autonomy, purpose, and human connection
✨ Join us and help shape the future of intelligent systems—one data pipeline at a time.