Dive deep into our ecosystem across awareness, acquisition, retention, revenue, and referral to uncover opportunities for growth and impact.
Conduct in-depth analyses to guide strategic decisions and identify new opportunities.
Design and evaluate experiments to ensure we release impactful features with measurable results.
Define and monitor KPIs, building scalable frameworks and dashboards that empower data-informed decision-making.
Shape the data foundations and processes for how we use data across teams, helping to build a world-class analytics practice.
Who you are
We're looking for exceptional individuals who combine technical excellence with ethical awareness, are excited by hard problems, and are motivated by human impact. You'll thrive with us if you:
Are passionate about audio AI, driven by a desire to make content universally accessible and to push the frontiers of new technology.
Are a highly motivated and driven individual with a strong work ethic. Our team recognizes this critical moment in the evolution of audio AI and is committed to going the extra mile to lead.
Are analytical, efficient, and thrive on solving complex challenges with a first-principles mindset.
Consistently strive for excellence, delivering high-quality work quickly and exceeding expectations.
Take initiative and work autonomously from day one, prioritizing learning and contribution while leaving ego aside.
What you bring
Expertise in product analytics, with a strong foundation in SQL/Python and familiarity with analytics tools across the data stack.
Strong product sense with an ability to put yourself in the shoes of our users and derive actionable insights from qualitative and quantitative data.
A deep understanding of key product metrics such as retention, activation, and lifetime value (LTV).
Proven ability to design and analyze A/B tests, ensuring statistical integrity and actionable results.
Experience building data pipelines and dashboards that effectively communicate insights to stakeholders.
Not afraid to get your hands dirty to get the job done, whether that's implementing your own telemetry, diving into the codebase to debug a logging issue, or adopting new tools as our data stack evolves.