Architect, build, and maintain our anti-abuse and content moderation infrastructure designed to protect us and end users from unwanted behavior.
Develop ubiquitous moderation coverage across our deployments
Expand our internal safety tooling and infrastructure
Collaborate with broader engineering teams to design and build safety mitigations across our product suite
Collaborate with our data team to develop and maintain actionable safety metrics
Who you are
We're looking for exceptional individuals who combine technical excellence with ethical awareness, who are excited by hard problems and motivated by human impact. You'll thrive with us if you:
Are passionate about audio AI, driven by a desire to make content universally accessible and to break the frontiers of new technology.
Are a highly motivated and driven individual with a strong work ethic. Our team is aware of this critical moment of audio AI evolution and is committed to going the extra mile to lead.
Are analytical, efficient, and thrive on solving complex challenges with a first-principles mindset.
Consistently strive for excellence, delivering high-quality work quickly and exceeding expectations.
Take initiative and work autonomously from day one, prioritizing learning and contribution while leaving ego aside.
What you bring
Strong experience in Python, including asynchronous Python; proven track record of building production Python applications.
Strong background in backend development including setting up and maintaining production backend services and data pipelines.
Familiarity with common software and system design patterns and infrastructure, including APIs, cloud infrastructure tools, storage solutions, and data structures.
Knowledge about test design and security fundamentals.
Strong candidates will also have a mix of experience working with:
Backend safety infrastructure tooling
Trust and safety, integrity or AI safety teams
SQL and data analysis tools
React
Technical investigators and machine learning engineers to integrate ML/AI models that detect, monitor, and enforce against abusive content