
Data Engineer
Roadie, a UPS company, is a leading logistics and delivery platform that helps businesses tackle the complexities of modern retail with unmatched delivery coverage, flexibility and visibility. Reaching 97% of U.S. households across more than 30,000 zip codes — from urban hubs to rural communities — Roadie provides seamless, scalable solutions that meet a variety of delivery needs.
With a network of more than 310,000 independent drivers nationwide, Roadie offers flexible delivery solutions that make complex logistics challenges easy, including solutions for local same-day delivery, delivery of big and bulky items, ship-from-store and DC-to-door.
Roadie is seeking a Data Engineer who’s passionate about data and making data-driven decisions to join our growing Data Engineering team. This role will focus on building, maintaining, and scaling our data pipelines, platform, and CI/CD processes to support our streaming architecture. You will work closely with our business and product owners to create useful, easy-to-consume data for reporting and analytics, and you will have the opportunity to help shape the long-term strategy for our Data Platform architecture. Our data is dynamic across many teams, so a collaborative and communicative spirit is essential to your success in this role.
What You'll Do
- Build efficient, high-quality, integrated, and increasingly real-time ETL/ELT pipelines that collect, organize, standardize, and integrate data, producing more complete, accurate, and consistent data for insights and reporting
- Partner with engineering, data science, and business stakeholders to build data insights and help them achieve their business goals
- Deliver data capabilities and new data products to accelerate investments in data and democratize data across the enterprise
- Provide production support for data tools and data needs across the business
- Research and implement cutting-edge solutions to challenges in ETL/ELT, data processing, and analytics
Roadie’s Technology Stack
- Python, Ruby on Rails, Golang
- React/Redux, Objective-C and Swift, Android
- Postgres, Redshift, Redis, Kafka
- AWS
- Docker/Kubernetes
- Prometheus/Thanos/Loki/Grafana
- GitHub/CircleCI/ArgoCD
- Airflow/Apache Pinot/Trino
- dbt/Superset/Redash/Lightdash
What You Bring
- 4+ years of experience building data infrastructure and platforms using streaming frameworks such as Kafka/Kinesis/Flink/Spark Structured Streaming, etc.
- 3+ years of experience using CI/CD tools to build and deploy projects
- 3+ years of experience working with DevOps tools such as Docker, Kubernetes, EKS, Helm, and Terraform to own end-to-end deployment of Data Engineering projects
- 3+ years of experience with one or more languages/frameworks such as Python/Scala/Spark/PySpark/Java/Hadoop
- 4+ years of experience with scaling production systems, ideally in a start-up environment
- 5+ years of experience working with data systems, data warehouse solutions, and ETL/ELT
- 3+ years of experience developing and optimizing dbt models, implementing best practices, and ensuring data quality through testing and documentation. Proficient in version control (Git) and integrating dbt with modern data warehouses (e.g., Snowflake, BigQuery, Redshift, PostgreSQL).
- 3+ years of experience working with workflow management tools such as Airflow/Luigi/Dagster
- 3+ years of experience with PostgreSQL, as well as MPP databases such as Redshift
- Experience implementing data quality checks using dbt tests, custom SQL validations, and data profiling tools (see the illustrative sketch after this list). Proficient in designing automated testing frameworks to ensure data accuracy, consistency, and reliability across pipelines.
- Experience with OLAP datastores such as ClickHouse, Apache Pinot, and Druid
- Experience with cloud-based architectures on AWS, GCE, or similar
- Experience with various data visualization tools such as Redash/Superset/Tableau/Lightdash/Power BI
- Bachelor's degree in Computer Science or related technical field or equivalent practical experience
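To give a flavor of the data-quality work mentioned above, here is a minimal, illustrative sketch of a custom SQL validation orchestrated with Airflow (both part of Roadie’s stack). This is not Roadie’s actual code: the connection, table, and column names ("roadie_postgres", "deliveries", "destination_zip") are hypothetical placeholders, and it assumes Airflow with the common.sql provider installed.

```python
# Minimal, illustrative sketch only (not Roadie's actual pipeline code).
# Assumes apache-airflow with the common.sql provider installed, and a
# Postgres connection registered in Airflow as "roadie_postgres".
# Table and column names ("deliveries", "destination_zip") are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLCheckOperator

with DAG(
    dag_id="data_quality_checks",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # SQLCheckOperator fails the task if the query's first row contains a
    # falsy value, so this check fails whenever any delivery row is missing
    # a destination zip code.
    check_zip_not_null = SQLCheckOperator(
        task_id="check_delivery_zip_not_null",
        conn_id="roadie_postgres",
        sql="SELECT COUNT(*) = 0 FROM deliveries WHERE destination_zip IS NULL",
    )
```

An equivalent check could also be expressed as a dbt test (for example, a not_null test declared in a model’s schema.yml) and run as part of the pipeline’s testing step.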
Bonus
- Experience working with Databricks/Qubole/Snowflake or similar platforms using Spark/PySpark/Scala
- Application and front end development experience
- Experience with AWS Zero-ETL
Why Roadie?
- Competitive compensation packages
- 100% covered health insurance premiums for yourself
- 401k with company match
- Tuition and student loan repayment assistance (that’s right - Roadie will contribute directly to your existing student loans!)
- Flexible work schedule with unlimited PTO
- Monthly 3-day weekends
- Monthly WFH stipend
- Paid sabbatical leave: tenured team members are given time to rest, relax, and explore
- The technology you need to get the job done
This role is not eligible for visa sponsorship. Applicants must be authorized to work for any employer in the U.S.