QA Specialist - Minor Safety and Exploitative Content
Discord is used by over 200 million people every month for many different reasons, but there’s one thing that nearly everyone does on our platform: play video games. Over 90% of our users play games, spending a combined 1.5 billion hours playing thousands of unique titles on Discord each month. Discord plays a uniquely important role in the future of gaming. We are focused on making it easier and more fun for people to talk and hang out before, during, and after playing games.
We are looking for a detail-oriented professional with a strong passion for safeguarding vulnerable groups and combating exploitative content online. As a QA Specialist for Minor Safety and Exploitative Content (MSEC) at Discord, you will play a pivotal role in ensuring the accuracy, consistency, and quality of moderation decisions that uphold our community standards. This role reports to the Team Lead for Trust & Safety QA and will partner closely with MSEC. Your approach to quality assurance is rooted in empathy, precision, and a commitment to continuous improvement.
What You'll Be Doing
- Review and audit moderation decisions related to minor safety and exploitative content to ensure adherence to Discord’s Trust & Safety policies.
- Collaborate with moderators, analysts, and policy teams to identify trends, gaps, and inconsistencies in content review processes.
- Provide constructive feedback and actionable insights to moderators to improve decision-making accuracy and maintain policy alignment.
- Develop and lead calibration sessions for the moderation team based on audit findings and evolving content standards.
- Partner with MSEC and other cross-functional teams to influence policy updates and improve internal tools and workflows for greater efficiency and scalability.
- Regularly report on quality trends and metrics, highlighting risks, successes, and opportunities for process improvements.
What You Should Have
- 2+ years of experience in quality assurance, trust & safety, or content moderation, preferably in a tech or online platform environment.
- Deep understanding of issues related to minor safety, exploitative content, and global online safety trends.
- Excellent analytical skills with the ability to synthesize large datasets and translate them into actionable insights.
- Strong communication skills, both written and verbal, to effectively convey findings and train teams.
- Familiarity with moderation tools, audit processes, and metrics-driven performance tracking.
- A calm, resilient demeanor when handling sensitive or potentially distressing content.
- Ability to flex your expertise to support other QA initiatives, including those covering automation and machine learning, violent and hateful content, cybercrime, and other exploitative content.
Bonus Points
- Experience working on global teams or in environments that require cultural sensitivity and awareness.
- Experience with data analytics tools and languages like SQL.
- Proficiency in multiple languages to support international moderation efforts.
- Demonstrated success in driving cross-functional initiatives or policy changes in a Trust & Safety context.
- Experience working with machine learning systems, automation tools, and LLM/AI technologies.
Requirements
- This role requires regular exposure to potentially traumatic material, including CSAM and other forms of exploitative, hateful, violent, or shocking content.
- This role's hours are Monday-Friday, 9:00 AM to 5:00 PM Pacific Time, with occasional flexibility required to accommodate our global partners.
#LI-Remote
The US base salary range for this full-time position is $108,000 to $121,500 + equity + benefits. Our salary ranges are determined by role and level. Within the range, individual pay is determined by additional factors, including job-related skills, experience, and relevant education or training. Please note that the compensation details listed in US role postings reflect the base salary only and do not include equity or benefits.
Why Discord?
Discord plays a uniquely important role in the future of gaming. We're a multiplatform, multigenerational, and multiplayer platform that helps people deepen their friendships around games and shared interests. We believe games give us a way to have fun with our favorite people, whether listening to music together or grinding in competitive matches for diamond rank. Join us in our mission! Your future is just a click away!
Please see our Applicant and Candidate Privacy Policy for details regarding Discord's collection and use of personal information relating to the application and recruitment process.