
Research Engineer / Scientist (SLAM)
About World Labs:
We build foundational world models that can perceive, generate, reason, and interact with the 3D world — unlocking AI's full potential through spatial intelligence by transforming seeing into doing, perceiving into reasoning, and imagining into creating.
We believe spatial intelligence will unlock new forms of storytelling, creativity, design, simulation, and immersive experiences across both virtual and physical worlds.
We bring together a world-class team united by shared curiosity, passion, and deep technical backgrounds spanning AI research, systems engineering, and product design, creating a tight feedback loop between our cutting-edge research and the products that empower our users.
Role Overview:
We’re looking for a SLAM Specialist to design, implement, and advance state-of-the-art simultaneous localization and mapping systems that enable accurate, robust spatial understanding from real-world sensor data. This role is focused on modern SLAM techniques—both classical and learning-based—with an emphasis on scalable state estimation, sensor fusion, and long-term mapping in complex, dynamic environments.
This is a hands-on, research-driven role for someone who enjoys working at the intersection of robotics, computer vision, and probabilistic inference. You’ll collaborate closely with research scientists, ML engineers, and systems teams to translate cutting-edge SLAM ideas into production-ready capabilities that form the backbone of our world modeling stack.
What You Will Do:
- Design and implement modern SLAM systems for real-world environments, including visual, visual-inertial, lidar, or multi-sensor configurations.
- Develop robust localization and mapping pipelines, including pose estimation, map management, loop closure, and global optimization.
- Research and prototype learning-based or hybrid SLAM approaches that combine classical geometry with modern machine learning methods.
- Build and maintain scalable state estimation frameworks, including factor graph optimization, filtering, and smoothing techniques.
- Develop sensor fusion strategies that integrate cameras, IMUs, depth sensors, lidar, or other modalities to improve robustness and accuracy.
- Analyze failure modes in real-world SLAM deployments (e.g., perceptual aliasing, dynamic scenes, drift) and design principled solutions.
- Create evaluation frameworks, benchmarks, and metrics to measure SLAM accuracy, robustness, and performance across large datasets.
- Optimize performance across the stack, including real-time constraints, memory usage, and compute efficiency, for large-scale and production systems.
- Collaborate with reconstruction, simulation, and infrastructure teams to ensure SLAM outputs integrate cleanly with downstream world modeling and rendering pipelines.
- Contribute to technical direction by proposing new research ideas, mentoring teammates, and helping define best practices for localization and mapping across the organization.
Key Qualifications:
- 6+ years of experience working on SLAM, state estimation, robotics perception, or related areas.
- Strong foundation in probabilistic estimation, optimization, and geometric vision (e.g., bundle adjustment, factor graphs, Kalman filtering).
- Deep experience with one or more SLAM paradigms (visual, visual-inertial, lidar, multi-sensor, or hybrid systems).
- Proficiency in Python and/or C++, with hands-on experience building research or production-grade SLAM systems.
- Experience with numerical optimization libraries and/or robotics frameworks.
- Familiarity with learning-based perception or representation learning and how it can augment classical SLAM pipelines.
- Strong understanding of real-world sensor characteristics, calibration, synchronization, and noise modeling.
- Proven ability to work in ambiguous, fast-moving environments and drive projects from concept through deployment.
- A strong sense of ownership and engineering rigor: you care deeply about correctness, stability, and measurable improvements.
- Enthusiasm for collaborating with a small, high-caliber team and for raising the technical bar through thoughtful design, experimentation, and code quality.
Who You Are:
- Fearless Innovator: We need people who thrive on challenges and aren't afraid to tackle the impossible.
- Resilient Builder: Advancing Large World Models isn't a sprint; it's a marathon with hurdles. We're looking for builders who can weather the storms of groundbreaking research and come out stronger.
- Mission-Driven Mindset: Everything we do is in service of creating the best spatially intelligent AI systems, and using them to empower people.
- Collaborative Spirit: We're building something bigger than any one person. We need team players who can harness the power of collective intelligence.
We're hiring the brightest minds from around the globe to bring diverse perspectives to our cutting-edge work. If you're ready to work on technology that will reshape how machines perceive and interact with the world, World Labs is your launchpad.
Join us, and let's make history together.
Equal Opportunity & Pay Transparency
Equal Employment Opportunity
World Labs is an equal opportunity employer. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, genetic information, veteran status, or any other characteristic protected under applicable law. We welcome all qualified applicants and are committed to providing reasonable accommodations throughout the hiring process upon request.
California Pay Transparency
In accordance with California law, we disclose the following:
Pay Range: $250,000-$350,000 base salary (good-faith estimate for San Francisco Bay Area upon hire; actual offer based on experience, skills, and qualifications)
Total Compensation: Base salary plus equity awards and annual performance bonus
Salary History: We do not request or consider prior compensation in making offers
Compliance: Cal. Lab. Code §432.3 (pay scale disclosure & salary history ban); Cal. Lab. Code §1197.5 (Equal Pay Act); Cal. Gov. Code §12940 (FEHA); 42 U.S.C. §2000e (Title VII); 29 U.S.C. §621 (ADEA); 42 U.S.C. §12101 (ADA)
Accommodations & inquiries: talent@worldlabs.ai