
Junior Data Engineer
New York
About the role
Teza Technologies is looking for a Junior Data Engineer to join our data team. Data drives systematic trading and is critical to all aspects of the firm's business.
This is a hands-on position on a small team of data engineers with significant growth potential, as the team is expected to expand rapidly over the next couple of years. The firm is looking for outstanding technical skills, strong attention to detail, and experience architecting and building data platforms.
Responsibilities:
- Work directly with Portfolio Managers and Quantitative Developers to translate business requirements into technical solutions; be a resource to explain dataset details and nuances.
- Expand our data warehouse by designing and adding new sources and functionality; improve the robustness, speed, and scalability of our systems; manage data entitlements.
- Provide innovative data management, analytics and technology input to the team and management.
- Evaluate new tools and technologies suitable for organizing, querying and streaming large datasets.
- Design and build automated systems for data cleansing, anomaly detection, monitoring and alerting.
- Support our production data warehouse as required.
- Develop and maintain strong vendor relationships aligned with our business objectives.
Basic Requirements:
- Proficiency in Python and Unix/Linux for data manipulation, scripting, and automation.
- Strong SQL knowledge and familiarity with NoSQL databases (ideally Postgres and MongoDB), including query optimization and performance tuning.
- Strong understanding of data modeling principles, including both normalization and denormalization techniques.
- Experience with on-premises data infrastructure (e.g., Hadoop).
- Experience with Git version control, collaborative workflows (e.g., GitHub), and an understanding of CI/CD best practices.
- Bachelor’s degree in Computer Science, Data Science or related field.
Nice-to-Have Requirements:
- Financial industry internship experience.
- Experience with Java.
- Familiarity with cloud platforms (e.g., AWS or GCP).
- Experience with Apache Airflow or similar workflow orchestration tools.
- Master’s degree in Computer Science, Data Science or related field.
Benefits:
- Health, vision, and dental insurance
- Flexible sick time policy