Portnoff is using her computer science background to build technology that defends children from sexual abuse

Rebecca Portnoff ’12
Courtesy of Rebecca Portnoff ’12

Before there was ChatGPT or the Oscar-winning film Her, Rebecca Portnoff ’12 had already turned an academic eye toward artificial intelligence, or AI: As a computer science concentrator, Portnoff wrote her thesis on machine learning and natural language processing. But she wasn’t sure where that work would take her after graduation. 

Meanwhile, she developed a second interest, inspired by the book Half the Sky, by Sheryl WuDunn *88 and Nicholas Kristof, and its reporting on efforts to combat human rights abuses against women and girls around the world. “I came to the conclusion that I wanted to work in this space, but I had little to no clarity on what it meant for a computer scientist to contribute to this kind of effort,” Portnoff says. 

After starting a Ph.D. program at the University of California, Berkeley, she began cold-calling law enforcement and nonprofits to learn more about how her skills could help fight such abuses. “I really learned so much from these folks on the ground about the reality of this issue space, what ways that technology exacerbates the problem, and what ways that technology can be used effectively to combat the problem,” she says. 

Eventually Portnoff found her way to Thorn, an international organization that builds technology to defend children from sexual abuse. She now leads Thorn’s data science research team, which has its work cut out for it. 

The National Center for Missing and Exploited Children reportedly received over 100 million files, including images, of suspected child sexual abuse material in 2023. “Since the advent of the internet, the number of reports and files reported has just exponentially increased,” Portnoff says. “This is just an obscenely huge number, and requires collaboration and also technology to be able to effectively navigate the scale of that kind of content.” 

Portnoff’s team works with law enforcement and hosting platforms to find, remove, and record each instance of child sexual abuse material, and they even try to prevent the material from being created in the first place. While AI can help accelerate her team’s mission, Portnoff has also seen AI exacerbate harm: For example, “bad actors” can misuse generative AI technologies to create realistic AI-generated child sexual abuse material. 

Portnoff is now leading an initiative with the nonprofit All Tech Is Human to get technology companies leading in the AI space — including OpenAI, Google, and Meta — to commit to new safety measures that can prevent the misuse of AI as it pertains to sexual abuse against children. At this point, Portnoff says, these companies need to be held accountable to keep their commitments to protect children, and she is dedicated to continuing this work. 

“It can be really challenging to look into the face of evil like this — for me to do that requires holding on to hope that we’re going to see justice in the long run,” Portnoff says. “I know that there are real challenges that persist, but I also have hope that in collaboration, we can have the impact that we’re seeking to have. 

“I think that in this space, it can be sometimes easy to raise your hands and say, ‘Well, it’s too complicated, there’s nothing we can do about it.’ I strongly reject that. I think that there are many very practical, actionable things that can be done to have impact here. I’m going to keep working at that as long as I still have breath.”