
Data Engineer
About Workato
Workato transforms technology complexity into business opportunity. As the leader in enterprise orchestration, Workato helps businesses globally streamline operations by connecting data, processes, applications, and experiences. Its AI-powered platform enables teams to navigate complex workflows in real-time, driving efficiency and agility.
Trusted by a community of 400,000 global customers, Workato empowers organizations of every size to unlock new value and lead in today’s fast-changing world. Learn how Workato helps businesses of all sizes achieve more at workato.com.
Why join us?
Ultimately, Workato believes in fostering a flexible, trust-oriented culture that empowers everyone to take full ownership of their roles. We are driven by innovation and looking for team players who want to actively build our company.
But, we also believe in balancing productivity with self-care. That’s why we offer all of our employees a vibrant and dynamic work environment along with a multitude of benefits they can enjoy inside and outside of their work lives.
If this sounds right up your alley, please submit an application. We look forward to getting to know you!
Also, feel free to check out why:
- Business Insider named us an “enterprise startup to bet your career on”
- Forbes’ Cloud 100 recognized us as one of the top 100 private cloud companies in the world
- Deloitte Tech Fast 500 ranked us as the 17th fastest-growing tech company in the Bay Area, and 96th in North America
- Quartz ranked us the #1 best company for remote workers
Responsibilities
We are seeking a highly skilled and motivated Data Engineer to join our dynamic data team. This individual will play a pivotal role in developing and maintaining scalable data pipelines, data models, and critical analytics dashboards. The ideal candidate will possess strong expertise in SQL, DBT, Python, and Business Intelligence tools. Additionally, experience in AI, specifically in prompt engineering, Large Language Models (LLMs), and building data-driven AI agents, would be highly advantageous.
In this role, you will:
- Design, build, and maintain robust and scalable data pipelines to enable efficient data integration across the organization.
- Develop and optimize ELT-based data models using SQL and DBT.
- Collaborate closely with cross-functional teams to understand data needs and translate them into technical implementations.
- Leverage Python and associated libraries to conduct complex data analysis and generate actionable insights.
- Create and manage advanced dashboards and reports using BI tools such as Sigma, Looker, Tableau, or Power BI.
- Ensure data accuracy and integrity through rigorous validation and testing.
- Partner with cross-functional data teams to maintain compliance with data governance, security, and privacy standards.
- Continuously explore and adopt innovative tools, technologies, and methods to enhance data efficiency and quality.
- Develop advanced monitoring and alerting systems for data pipelines and analytics processes.
- Translate business requirements into structured data models suitable for ad-hoc reporting and analytics.
Requirements
Qualifications / Experience / Technical Skills
- Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, Statistics, or a related discipline.
- 5+ years of relevant industry experience in data engineering, analytics engineering, or business intelligence roles.
- Strong proficiency in SQL and Python, with hands-on experience in ETL/ELT development and data warehousing (star schema/snowflake schema).
- Demonstrated expertise in using data integration tools (e.g., Workato, Fivetran, Matillion, Informatica).
- Experience with modern data warehouse platforms such as Snowflake, Redshift, or BigQuery.
- Solid experience using data transformation tools like DBT or SQLMesh.
- Proficiency with business intelligence tools including Sigma, Looker, Tableau, or Power BI.
Soft Skills / Personal Characteristics
- Exceptional analytical skills with meticulous attention to detail and accuracy.
- Strong interpersonal and communication skills, adept at working collaboratively in fast-paced environments.
Preferred Qualifications (Nice to Have)
- Experience or familiarity with AI technologies, specifically prompt engineering, Large Language Models (LLMs), or AI-driven agents.
- Proven experience developing or deploying intelligent agents or automations using AI frameworks.
- Knowledge or experience with cloud services such as AWS, Google Cloud, or Azure.
- Familiarity with automation platforms like Workato.
- Experience with real-time analytics and data streaming technologies such as Kafka or StreamSets.
For California applicants, the pay for this role begins at $100,000 plus variable compensation, benefits, perks, and equity.
(REQ ID: 2041)