Moments after a grand jury declined to charge Darren Wilson, the Ferguson, Mo., police officer who shot and killed Michael Brown in 2014, Brown’s parents issued a statement urging supporters to “join with us in our campaign to ensure that every police officer working the streets in this country wears a body camera.”
The idea of using technology to increase police accountability and transparency had been gathering momentum, and the U.S. Department of Justice soon jumped on board. In 2015 and 2016, it awarded $36 million in grants to local police departments for body-worn cameras.
But many civil-rights activists saw the adoption of cameras “in a very different light,” says Harlan Yu *12, a principal at Upturn, a tech-policy consulting group based in Washington, D.C. With camera footage under the control of the police department, there would be no guarantee that it would be made available to the public as evidence for either a criminal defense or police-conduct complaint. And the cameras point not at officers, but at the people in communities they’re patrolling, significantly increasing public surveillance in the most heavily policed neighborhoods. With the added potential to incorporate advanced biometric tools like facial recognition, some feared that cameras would expand existing disparities in law enforcement. (Axon and Motorola, two of the leading body-camera vendors, have since publicized plans to offer real-time identification tools built into their devices.)
“Even a technology that seemingly would be used in the interest of the community is being reframed and reused as another tool for police power,” Yu says.
Yu, a computer science Ph.D., and his Upturn co-founder, David Robinson ’04, were well positioned to contribute to the emerging body-camera debate. In 2014, they had worked with civil-rights leaders to draft “Civil Rights Principles for the Era of Big Data,” guidelines for fairness, privacy, and equal opportunity that drew the attention of White House adviser John Podesta, then the head of President Barack Obama’s working group on privacy and big data. When Podesta’s group made its report to the president in May 2014, the cover letter echoed the principles that Upturn had highlighted. “[B]ig-data analytics have the potential to eclipse long-standing civil-rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace,” Podesta and his colleagues wrote. “Americans’ relationship with data should expand, not diminish, their opportunities and potential.”
In May 2015, alongside partners in the civil-rights community, Yu and Robinson helped to develop a new set of principles, this time specifically addressing the use of body cameras. More than 30 organizations signed on, ranging from the ACLU to the NAACP to the Electronic Frontier Foundation, a leading tech-privacy nonprofit in San Francisco. They weren’t advocating for or against body cameras. It was more of an “if ... then ... ”: If departments are going to adopt cameras, then these are the guidelines they should follow. Upturn and the Leadership Conference on Civil and Human Rights followed up with a meticulously researched “policy scorecard,” first released in November 2015 and updated twice, grading the body-camera policies of major police departments.
The results were less than promising: On a scorecard with eight criteria, such as “addresses personal privacy concerns” and “makes footage available to individuals filing complaints,” most department policies satisfied fewer than half. But according to Sakira Cook, senior counsel at the Leadership Conference, the scorecard — and the media coverage it generated — “had a huge impact on influencing the conversation around body cameras.” Law-enforcement officials reached out to the Leadership Conference to discuss the areas where they were falling short and the ways they might improve their policies (and scores).
Ben Wizner, director of the ACLU’s Speech, Privacy, and Technology Project, points to the body-camera effort as an example of Upturn’s role in the civil-rights arena. “Here’s an instance of a proliferating new technology that has clear benefits and clear dangers: the benefit in being able to improve police accountability and the danger of creating another widely available surveillance tool,” Wizner says. “The work that they did in trying to balance those dynamics in a rapidly changing technological environment was really laudable.”
Recent research has supported the civil-rights community’s skepticism: In a study published in October, the Lab @ DC, a team of social scientists working for the Washington, D.C., city administrator, found that the Metropolitan Police Department’s use of body cameras had “no detectable, meaningful effect on documented uses of force.” Shifting policies, attitudes, and actions won’t be easy, but Yu is encouraged by the way that civil-rights leaders have engaged the issue. “It just means that we have to keep fighting harder to push back against the most egregious practices that we see,” he says.
Working in a few glass-partitioned rooms within a fashionable shared-office suite on K Street, Yu and Robinson have carved a niche at the intersection of technology and public policy. Upturn has the technical chops to design and build complex software (it built legislation-drafting software now used by the House Office of the Legislative Counsel), but its larger impact is making tech-policy debates more accessible and inclusive — for example, through technology primers, research reports, and policy memos written for civil-rights and social-justice groups. What started as a two-person partnership has grown to a group of five that operates primarily behind the scenes, with the help of grants from the Ford Foundation, the MacArthur Foundation, and others. They are, in Robinson’s words, “happy warriors,” with a watchful eye on potential pitfalls of technology and an underlying optimism about its potential to improve lives.
Robinson, a philosophy major, Yale Law graduate, and Potomac, Md., native, traces his hopeful view of technology back to his childhood. Born four weeks prematurely, he has a mild case of cerebral palsy, which made writing difficult during his elementary-school years. “When ‘writing’ on your report card meant penmanship, I was terrible at it,” he recalls. “Then at some point I got a computer, and it was life-changing for me because it turned out that I love writing — and I’m able to do it well. ... Technology has such a powerful, liberatory role in my own life.”
Yu grew up in San Jose, Calif., in the heart of Silicon Valley, and studied electrical engineering and computer science at the University of California, Berkeley, before starting graduate school at Princeton. He has an easygoing manner underscored by technical virtuosity, says J. Alex Halderman ’03 *09, a professor of computer science and engineering at the University of Michigan and longtime friend of both Yu and Robinson. “Even if it’s a complicated technical issue, working with Harlan on it, you feel like you’re just hanging out, everything’s cool,” Halderman says, “and then the problem’s solved.”
Based solely on their academic backgrounds, one could paint the two as complementary parts of a cohesive whole — Yu the technologist and coding wizard; Robinson the lawyer and wordsmith — but in reality, their partnership thrives on shared traits. Yu deftly bridges technical and intellectual issues in the policy pieces and op-eds that he writes, while Robinson has an enduring, hands-on fascination with computers.
Their work at Upturn took root at Princeton, where the two met during the early days of the Center for Information Technology Policy (CITP), founded by Professor Edward Felten with the aim of using technology to address practical challenges in the policy realm. Yu, one of Felten’s Ph.D. students, had just finished a summer working in California on a state-sponsored review of electronic voting machines. Robinson, who studied philosophy and politics as a Rhodes scholar at Oxford, was coming back to campus to work as CITP’s first associate director.
CITP, at the time, was relatively small — a place where students, faculty, and staff would gather around a lunch table and hash out ideas for new projects. Those lunchtime conversations were the genesis of Yu and Robinson’s first formal collaboration, along with Felten and William Zeller *08: a paper titled “Government Data and the Invisible Hand.” In it, they argued that when it came to data, the government should focus on providing the information in machine-readable formats such as XML. That would improve access to official data, allowing entrepreneurs to develop useful apps, sites, and data visualizations even while the government upgraded agency websites. The paper proved influential in the Obama administration’s open-data initiatives, including the 2009 launch of Data.gov, now home to data sets from nearly 200 federal, state, and local organizations.
After two years at CITP, Robinson left to begin law school at Yale while Yu continued to pursue computer-science projects that incorporated his policy interests. His dissertation work included a deep dive into the U.S. Code of Laws, the official record of federal statutes. “The big-picture idea that Harlan had was to take what computer scientists know about managing large, collaboratively written texts and to figure out how to apply it to the writing of U.S. statutes,” Felten says. Over time, the U.S. Code has accumulated redundancies, errors, and inconsistencies — analogous to bugs in a large, frequently patched set of computer code — so Yu set out to “debug” the statutes in a way that would help legislators drafting future laws to avoid repeating or compounding past mistakes.
Before his final year of law school, Robinson interned at a D.C. law firm. Yu was also living in Washington, where he’d worked on an open-government project at the U.S. Department of Labor before beginning the final drafts of his dissertation. Both had promising, traditional career paths ahead — but they couldn’t shake an idea they’d been discussing: a consulting group that would draw on the kinds of work they’d found so compelling at Princeton. CITP had a mission to inform stakeholders, Robinson says, and its work was intended to be neutral; Robinson and Yu wanted to use the same meticulous, analytical approach to work with advocates involved in issues they cared about. In July 2013, the Ford Foundation — in the midst of launching efforts to promote internet freedom — gave them a significant boost with a grant to support civil-rights and privacy advocates with research on technology issues related to financial data and online voter registration.
Robinson and Yu aimed to make their partners feel confident in tech conversations, but first, they had to gain a clearer understanding of the civil-rights community. “It was a learning process on both sides,” Yu says. “We spent a lot of time listening, talking with groups on the ground, and reading, because when we first started working in this area, we didn’t have a history of doing social-justice work. A lot of the context was still very new to us.”
The give-and-take resonated with Wizner, the ACLU’s technology-project director. “For too long there’s been a divide in discourse and understanding between the technology community and the rest of us in how we try to communicate with each other,” he says. “What Harlan and David are both able to do so effectively is to translate very complex issues into simple language without dumbing it down and without being condescending. Not all technologists have that ability.”
In the last five years, Upturn, which now operates as a nonprofit, has helped its partners curb the proliferation of targeted online ads for payday loans (Facebook and Google banned them in 2015 and 2016, respectively). Robinson also is leading an effort to push back against the use of algorithms to predict risk in bail decisions. In many places in the United States, he argues, the bail system “tends to put people in jail for being poor,” so using historical patterns to set bail amounts may perpetuate past biases. Upturn is involved in the fight for internet freedom abroad, working with Halderman and other academics who are engineering networks that seek to circumvent censorship in countries like China and Iran. And Yu and Robinson have an eye on future issues, such as social equity in the development of self-driving cars, including the potential for carmakers’ mapping technology to create a form of “redlining,” restricting where autonomous vehicles can and cannot drive. “Ultimately, we hope for a world in which all participants understand how technologies work — and why they matter,” Robinson says.
Yu says places like Upturn and CITP — and computer-science departments in general — can show those in the field how they can “pull on various levers of policy,” which in turn makes computer science a more attractive subject for those with interests that range beyond coding. At Princeton, where computer science is the most popular major among seniors for the second straight year, the department has set a goal to enroll every undergraduate in at least one of its courses.
“We need many more computer scientists and technologists focusing on the core social problems — in housing, in education policy, in health policy, in all sorts of core areas where new technologies are going to shift the landscape,” Yu says. “As technology continues to permeate all aspects of our society, there’s just going to be a greater need for this kind of work.”
Brett Tomlinson is PAW’s digital editor.