This job is closed
We regret to inform you that the job you were interested in has been closed. Although this specific position is no longer available, we encourage you to continue exploring other opportunities on our job board.
User Protection is an organization dedicated to protecting Google's users from abuse, account compromise, and other harms online. Our team works within the Content Safety Platform (CSP) pillar, which develops tools to protect users from abusive content at scale, often leveraging AI technology to do so. Our team provides data science capabilities to Content Safety Platform and works directly with product and engineering to evaluate, understand, and improve the quality of our protections. Organizationally, we are part of a large data science team in Core, which provides ample opportunities for knowledge sharing, development, and learning from other data scientists working in adjacent domains.

Content Safety Platform equips Google products with tools to protect users from abuse and harm. As a Data Scientist working with CSP, you'll help evaluate, understand, and improve our abuse protections, which are generally built with and for AI tools. CSP Data Scientists work closely with cross-functional product teams on specific content safety classifiers, as well as on generic strategies and tooling for understanding those classifiers. Product safety is critical to the success of nearly all of Google's products, especially novel AI tools.