
Contract Analytical Data Engineer
The Motley Fool is looking for a Freelance Analytical Data Engineer to help us prototype and build a new data product powered by exhaust data. This is an independent contract role at 40 hours per week for at least 3 months, and is best suited for a mid-to-senior level engineer with 4–5+ years of relevant experience.
Who are we?
We are The Motley Fool, a purpose-driven financial information and services firm with nearly 30 years of experience focused on making the world smarter, happier, and richer. But what does that even mean?! It means we’re helping Fools (always with a capital “F”) demystify the world of finance, beat the stock market, and achieve personal wealth and happiness through our products and services.
The Motley Fool is firmly committed to diversity, inclusion, and equity. We are a motley group of overachievers who have built a culture of trust founded on Foolishness, fun, and a commitment to making the world smarter, happier, and richer. However you identify or whatever winding road has led you to us, please don't hesitate to apply if the description above leaves you thinking, "Hey! I could do that!"
What does this team do?
The Data Engineering team at The Motley Fool creates data pipelines to wrangle data from around the Fool. We collaborate with everyone—from third-party vendors to internal stakeholders—to build consumable data structures for reporting and business insights. While this role will collaborate with the broader data team, it is dedicated solely to prototyping a new data product, not to supporting ongoing business-line needs.
What would you do in this role?
This role is dedicated to the creation of a new data product, built from internal exhaust data. You'll work closely with our data and product teams to design, build, and document scalable data pipelines, ensure data quality and compliance, and package datasets in a way that supports future commercialization. This is a focused opportunity to shape a product from the ground up. It is not a generalist engineering role across our broader data stack.
But what would you actually do in this role?
- Build focused data pipelines and processing workflows to support the new data product
- Query and analyze large datasets in Snowflake using SQL
- Debug and resolve data pipeline issues throughout the prototyping lifecycle
- Develop workflows using Airflow (including custom operators) for scalable, modular data orchestration
- Design and prototype data products using internal exhaust/event data, working closely with stakeholders to identify monetizable opportunities
- Package and deliver datasets via Snowflake shares, APIs, or other external-facing interfaces
- Ensure data quality, governance, and privacy requirements are met for external use
- Implement data lineage and usage tracking to inform future improvements
- Document architecture, schema definitions, and user-facing metadata
- Collaborate across product, data, and business to validate product direction and dataset usability
- Stay current on tech trends and best practices in data product development and externalization
Required Experience:
- Strong experience building data pipelines, ideally with Snowflake and Airflow
- Experience with Snowflake Cortex and LLMs
- Proficiency in SQL, including multi-table joins, CTEs, and window functions
- Experience developing in Python, particularly for REST API ingestion and data manipulation
- Familiarity with AWS services including Lambda, S3, and IAM
- Experience building and orchestrating pipelines in Airflow, including DAG design and task management
- Experience with Snowflake external stages, data shares, RBAC, and Snowpark-based task automation
- Ability to work independently, communicate clearly, and deliver results with minimal supervision
- Experience working with complex and time series datasets
- Experience designing and building data products for external consumption (APIs, public data shares, customer-facing datasets)
- Experience documenting and packaging datasets for external use (schemas, dictionaries, metadata, user guides)
- Familiarity with data monetization strategies
- Experience working with event tracking data, such as GA4
Nice to Have:
- Experience working with financial data, especially intraday time series
- Personal or professional experience using The Motley Fool’s services
- Familiarity with data quality tools (e.g., Great Expectations, Monte Carlo, Soda)
- Experience with DevOps/IaC tools like Terraform or CloudFormation
- Experience designing with data contracts and managing interface versioning
- Experience working with cross-functional teams, including data governance, product, legal, or compliance, to prepare data for external release
Compensation:
Below is our target compensation range. While we are budget conscious, we're also eager to find the right person for this role, so if your target is outside this range, please don't hesitate to apply; we'd be happy to have a conversation.
Hourly Pay Range
$85–$100 USD
By applying on this site, you acknowledge that The Motley Fool will be collecting the personal data you provide for our recruiting purposes. Please see our Applicant Privacy Notice for additional information about how we process, transfer, and store your data, including where that data is stored, and about any additional privacy rights you may have based on your jurisdiction.