Senior Backend Engineer

São Paulo, Brazil

Join the Future of Cybersecurity with Radiant Security’s AI Revolution

Are you ready to dive into the world of Generative AI and be part of the team that’s transforming how security operations are done? At Radiant Security, we’re not just creating tools—we’re pioneering the industry’s first AI SOC Analyst, built to think and act like a top-tier security expert. Here, you’ll have the unique opportunity to learn and grow with cutting-edge AI technology. Our AI uses advanced algorithms to analyze security alerts, performing hundreds of dynamic tests in just minutes to deliver detailed incident reports, root cause analysis, and custom response plans. By the time an alert reaches a human analyst, the heavy lifting is done—they know whether it’s real, what caused it, and how to fix it.

As part of the Radiant team, you’ll work directly with this groundbreaking technology, learning how to harness its power to transform SOC workflows. Whether you’re looking to expand your skills in AI-driven automation, decision-making, or cybersecurity as a whole, Radiant Security is the place to grow and thrive.

This is a hybrid position; we work from the office three times a week. We have offices in Pleasanton, California and São Paulo, Brazil.

What you’ll do

  • Collaborate with our AI to test, analyze, and respond to real-world threats faster than ever before.
  • Learn from an evolving system that continuously adapts to new threats and trends, keeping you at the forefront of cybersecurity innovation.
  • Empower SOC teams to get more done with step-by-step AI guidance, one-click API responses, or fully automated solutions.

Why join us?

In the Fabric Services team, you will build and optimize high-performance data platforms that support efficient data ingestion and querying at scale. You’ll focus on developing software that keeps the system reliable, observable, and scalable, working closely with cross-functional teams to tackle data challenges head-on.

At Radiant, you’re not just joining a company—you’re becoming part of a revolution. You’ll work with a passionate team, using the most advanced AI technology available, and have the chance to make a real impact on the security industry. Ready to grow with us? If you’re excited about working on the cutting edge of cybersecurity, learning from top minds in AI, and helping create a safer digital world, apply today!

Key responsibilities

  • Design and implement scalable data ingestion pipelines, ensuring efficient and reliable data flow from diverse sources.
  • Build out core platform tooling, monitoring, and engineering solutions that the rest of the company can leverage to build their own performant applications.
  • Collaborate with engineering teams to integrate data solutions into the broader platform, ensuring system-wide performance optimization.
  • Monitor, troubleshoot, and enhance the performance of data pipelines and ingestion systems, ensuring minimal downtime and high reliability.
  • Optimize performance on large-scale datasets by applying best practices in indexing, caching, and partitioning.

Your experience

  • Solid programming skills in languages such as Python, Go, or Node.js, with experience building scalable, high-performance data processing solutions.
  • Familiarity with cloud infrastructure (AWS, GCP, or Azure) and the corresponding data services (e.g., Redshift, BigQuery).
  • Experience building data pipelines using tools such as Apache Kafka, AWS Kinesis, or similar technologies.
  • Knowledge of distributed systems and techniques for building fault-tolerant, high-availability platforms.
  • Strong understanding of database and search technologies (e.g., SQL, Elasticsearch, MongoDB) and of performance tuning for large datasets using distributed query engines (e.g., Presto, Snowflake).
  • Proven experience monitoring and troubleshooting data platforms using tools like Prometheus, Grafana, or Datadog.

Nice to have

  • Experience with real-time data streaming and ingestion frameworks (e.g., Apache Flink, Apache NiFi).
  • Hands-on experience in containerization and orchestration (Docker, Kubernetes) for deploying scalable data services.

The process

  • People team interview
  • Coding interview
  • Architecture test
  • Virtual onsite: leadership interviews
