Senior Data Engineer
Interested in working on cutting-edge blockchain technology and creating equitable access to the global financial system? Since 2014, the mission-driven team at the Stellar Development Foundation (SDF) has helped fuel the tremendous growth of the Stellar blockchain network, an open-source platform that operates at high-scale today. Developers and companies around the world build on it, and the SDF team is expanding to support the rapidly growing and changing Stellar ecosystem.
SDF is looking for a talented, experienced, hands-on Data Engineer to join our team. In this role, your primary focus will be all things data: designing, building, and maintaining data pipelines for Hubble, our publicly available analytics dataset.
As a member of our team, you’ll have the opportunity to work on a wide variety of projects that measure and assess the Stellar Network’s real world impact on the global financial system. This includes monitoring and understanding Soroban (smart contracts) adoption, tracking liquidity and transactional volume through the network, and building analytics products to help smart contract developers. Our aim is to make data more accessible, actionable, and easier to use for the broader Stellar ecosystem.
In this role, you will:
- Design, build, orchestrate, and maintain data pipelines that give us a unique perspective into the liquidity, adoption, and usage of the Stellar network.
- Conduct ad hoc data analysis to clean, transform, and distill key insights about the Stellar Network.
- Improve observability and maintainability of our data marts, including usage, quality, and freshness.
- Translate business priorities and community requests into data products that aid in data-driven decision-making.
- Improve data accessibility by encouraging self-service adoption of Dashboards, KPIs, and SQL Interfaces.
You have:
- 5+ years of professional data engineering, ML, or software engineering experience
- Solid understanding of modern data warehousing concepts and hands-on experience in an ETL framework like dbt, Fivetran, Databricks, or Talend
- Demonstrated ability to analyze large data sets using SQL to validate data and provide insights
- Expertise with ETL schedulers such as Apache Airflow, Dagster, AWS Glue or similar frameworks
- Strong programming skills in languages such as Python or Golang
- Excellent communication skills, with the ability to explain complex data insights to non-technical stakeholders
- Enthusiasm about working on a small, growing team with the freedom to innovate and set direction
Bonus points:
- Cloud development experience, with a preference for the GCP data stack
- Proficiency with CI/CD pipelines
- Experience administering BI tools and building compelling visualizations
- Familiarity with data governance best practices: data cataloging, quality control, and usability
- Strong curiosity about blockchain technologies and cryptocurrencies, and an understanding of the fundamentals of these systems
We offer competitive pay with a base salary range for this position of $160,000 - $205,000 depending on job-related knowledge, skills, experience, and location. In addition, we offer lumen-denominated grants along with the following perks and benefits:
Apply for this job