
Perception Engineer

Charlotte, NC


About Lucid Bots

Lucid Bots Inc. is an AI robotics company that extends human reach by building the world's most productive and responsible robots. Our robots allow people to perform dangerous and demanding tasks without putting human life and safety at risk.

Headquartered in Charlotte, we design, engineer, manufacture, and support our products domestically. Our current line of production-ready robots includes the Sherpa, a cleaning drone, and the Lavo Bot, a pressure-washing ground-based robot. Our products elevate safety and efficiency for customers all around the world. Lucid Bots was recently recognized as the fourth fastest-growing manufacturing company in the United States.

We are venture-backed, with multi-round investments from Y Combinator (S19 batch), Cubit Capital, Idea Fund Partners, Danu Ventures, and others. Learn more about our vision.

Our Core Values and How We Work Together

Meeting the demands of our fast-growing venture-backed startup requires commitment from every team member: a commitment to each other, to our customers, and to the broader community. Our core values guide us in how we fulfill that commitment every day:

  • Lead with Compassion: Foster an inclusive and supportive environment where people feel valued and heard.
  • Grow with Purpose: Embrace learning, adaptation, and intentional decision-making.
  • Win as One Team: Collaborate and share ownership of our vision and outcomes.
  • Be Positive Agents of Change: Take ownership and drive impactful actions.
  • Pursue Extraordinary Impact: Solve meaningful problems with innovation and effort.

Role: Perception Engineer

Goal of This Role

We are seeking an experienced Perception Engineer to own and advance the perception stack for our proprietary Lucid Operating System. Your work will be critical to enabling robust autonomy in environments where traditional sensors, such as LiDAR, underperform due to environmental conditions like mist and fog. You will drive innovation in SLAM and VIO systems, shaping the foundation for Lucid Bots' long-term autonomy roadmap.


How You Will Flourish in This Role

Success in this role requires deep expertise in perception systems, a passion for rapid iteration using AI-first tools, and a willingness to get hands-on with real robots in real-world conditions. You'll thrive at Lucid Bots if you're driven by impact, energized by cross-functional collaboration, and motivated to push the boundaries of autonomous systems. You’ll be a mentor, a systems thinker, and a builder.


What You'll Do:

  • Own the end-to-end perception stack for both aerial and ground robotic platforms.
  • Develop robust SLAM and VIO pipelines that fuse data from cameras (mono/stereo), IMU, wheel odometry, and LiDAR (when available).
  • Tackle the challenges of perception in adverse conditions like fog, spray, and dust.
  • Lead integration of perception systems into ROS 2 and PX4 environments.
  • Utilize AI-driven development tools (e.g., Cursor, VS Code agents, Copilot) for efficient prototyping, testing, and refactoring.
  • Drive rapid deployment and field validation cycles.
  • Mentor team members on best practices in perception architecture and AI toolchains.
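
To give candidates a feel for the fusion work described above, here is a deliberately tiny, illustrative sketch (not Lucid's actual stack) of the core idea behind blending wheel odometry with camera position fixes: dead-reckon between fixes, then correct when a fix arrives, weighting by uncertainty. A production VIO pipeline is far richer, but the predict/update rhythm is the same.

```python
class OdomCameraFuser:
    """Toy 1-D Kalman filter: predict position from wheel-odometry
    velocity, then correct with occasional camera position fixes.
    All names and noise values here are illustrative."""

    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.25):
        self.x = x0  # position estimate (m)
        self.p = p0  # estimate variance
        self.q = q   # process noise: odometry drift added per step
        self.r = r   # camera measurement noise variance

    def predict(self, v, dt):
        # Dead-reckon from wheel-odometry velocity; uncertainty grows.
        self.x += v * dt
        self.p += self.q

    def update(self, z):
        # Blend in a camera position fix; uncertainty shrinks.
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

After ten odometry-only predictions the estimate drifts with the wheels; a single camera fix then pulls it most of the way toward the measured position, because accumulated odometry uncertainty outweighs the camera's noise.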

Skills & Qualifications

Must-Haves:

  • 4+ years of experience building production-grade SLAM or perception systems for robots.
  • Deep proficiency with ROS 2, real-time C++, and Python.
  • Demonstrated success with camera-only autonomy under adverse environmental conditions.
  • Expertise in sensor fusion using camera, IMU, and odometry data.
  • Advanced fluency in AI-assisted coding tools (e.g., Cursor, Copilot, GPT agents).
  • Strong hands-on experience with CUDA, OpenCV, PyTorch, and on-robot debugging in field environments.

Nice-to-Haves:

  • Experience with Jetson/Xavier bring-up and GPU pipeline optimization (GStreamer, Vulkan).
  • Experience with cloud-based map storage and retrieval systems.
  • Contributions to open-source projects or published research in SLAM or visual odometry.

Why Join Lucid Bots?

  • Work with cutting-edge AI and robotics technology in a fast-growing startup.
  • Collaborate with a passionate, innovative, and ambitious team redefining safety and efficiency.
  • Gain mentorship and leadership insights from the Founder/CEO and senior leadership.
  • Enjoy opportunities for continuous learning and professional growth in a values-driven environment.

What We Expect You to Achieve in This Role

Within the first 3–6 months:

  • Deliver a robust camera-only perception pipeline capable of handling mist and spray on Lavo Bot and Sherpa.
  • Integrate SLAM/VIO into production-ready systems within ROS 2 and PX4.
  • Validate system performance under real-world conditions and iterate quickly based on feedback.

Long-term:

  • Set the foundation for our scalable autonomy architecture.
  • Lead Lucid Bots into the next phase of field-tested perception capabilities across multiple platforms.

Additional Notes

  • Location: This role is based in Charlotte, NC. Partial remote flexibility possible, but significant on-site presence is expected for hardware validation.
  • Travel: Occasional travel to field test sites may be required.

A Message from Our CAIO

At Lucid Bots, we’re not just hiring for a role—we’re bringing on people who believe in what we’re building. Watch this short video from our Chief AI Officer, Vic Pellicano, to hear why this role is important and why we’re excited to have someone like you join the team.


We’d Love to Hear from You!

We encourage you to respond to Vic’s video with your own short video (under 2 minutes) explaining why you’re excited about this opportunity and why you’d be a great fit for Lucid Bots. This is a chance to introduce yourself in a way that goes beyond your resume!

If you’re not sure where to record, you can use a tool like Loom.
