
Data Engineering Intern (Remote in Utah)
About your role
As a Data Engineering Intern, you will work with the Business Intelligence team to extract, transform, and load (ETL) data from internal and external sources into a Snowflake data warehouse. You'll gain hands-on experience with tools such as SQL, Airflow, and Google Cloud to orchestrate data flows and support data-driven decision-making across the organization. You'll collaborate closely with a diverse team of report developers, product owners, and QA staff to keep data operations smooth and accurate, all while working in an agile environment.
Reports to: Principal Data Engineering Architect
Location: Remote, but you must be located in CA, AZ, CO, NC, or UT during the internship
Duration: June 16 – August 18, 2025 (40 hours per week)
How you will make a difference day-to-day
- Data Integration: Analyze, extract, transform, and load data from multiple internal and external sources into a Snowflake warehouse to support data analysis and business intelligence efforts.
- SQL & API Calls: Extract data using SQL queries and API calls, then transform and load it using SQL-based processes.
- Orchestrate Data Flows: Use Airflow in Google Cloud to manage and orchestrate data flows, ensuring seamless data pipeline operations.
- Team Collaboration: Participate in daily team huddles, working alongside other engineers, report developers, and product teams to ensure alignment and progress.
- Collaboration with Report Developers: Work closely with report developers to create useful target tables and views, supporting various business intelligence reports.
- Collaboration with Product & QA Teams: Partner with product owners, release managers, and QA staff to validate and promote code to production.
- Agile Workflow: Track work on a JIRA board, alternating between Kanban and sprint boards week to week, helping to manage and track progress in an agile environment.
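The day-to-day duties above center on a classic extract-transform-load loop. Here is a minimal, hypothetical sketch of that pattern in plain Python, using an in-memory sqlite3 database as a stand-in for the Snowflake warehouse and a hard-coded sample payload in place of a real API response (all table and field names are invented for illustration):

```python
import sqlite3

# Extract: hypothetical payload standing in for a JSON API response.
raw_records = [
    {"user_id": 1, "plan": "premium", "mrr_cents": 3999},
    {"user_id": 2, "plan": "free", "mrr_cents": 0},
]

def transform(records):
    # Transform: keep only paying customers and convert cents to dollars.
    return [
        (r["user_id"], r["plan"], r["mrr_cents"] / 100)
        for r in records
        if r["mrr_cents"] > 0
    ]

def load(rows, conn):
    # Load: a target table a report developer might query for dashboards.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS paying_users "
        "(user_id INTEGER, plan TEXT, mrr_dollars REAL)"
    )
    conn.executemany("INSERT INTO paying_users VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(raw_records), conn)
```

In production, an orchestrator such as Airflow would schedule each of these stages as a separate task so failures can be retried independently.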
What you’ll need
- A senior undergraduate student pursuing a Bachelor's degree in Computer Science or a related field.
- Proficiency in SQL, Python, Git, and Linux.
- Ability to work collaboratively in a fast-paced, team-oriented environment.
- Familiarity with Google Cloud Platform, Airflow, and Snowflake.
Interview Process:
- Recruiter Phone Screen
- Role Assessment(s)
- Hiring Manager Interview
Compensation (hourly, by location):
- Utah, Arizona, North Carolina: $29.60 per hour
- California (outside the San Francisco Bay Area) and Colorado: $31.45 per hour
Actual compensation packages are determined by various factors unique to each candidate, including but not limited to skill set, depth of experience, certifications, specific work location, and performance during the interview process.
By applying for this position, your data will be processed in accordance with the Rocket Lawyer Privacy Policy.