Dec. 12, 2024

Behind the Lens

Balancing Ethics and Innovation in Smart City Surveillance
Michael Wolman

Amid the hustle and bustle of cities across the country, a largely invisible network of surveillance cameras silently observes the ebb and flow of urban life. These cameras, embedded into traffic lights, streetlamps and even police body armor, promise to enhance public safety, streamline urban planning and optimize resource management. Yet, as their lenses capture the minutiae of daily life, a critical question arises: Who is watching the watchers?

At The University of Texas at Austin, one of Good Systems’ six core projects, Being Watched: Embedding Ethics in Public Cameras, aims to address this question. Led by an interdisciplinary team of researchers, the project focuses on how ethics, transparency and accountability can be embedded into the deployment of camera-based surveillance technologies. By developing a robust framework for governance, the project seeks to balance technological innovation with civil liberties.

“The rapid adoption of surveillance technologies has outpaced the development of guidelines to ensure they’re used ethically,” said project lead Sharon Strover, a professor in the School of Journalism and Media at the Moody College of Communication and former Good Systems chair. “We see a cottage industry promoting data collection and analytics, but far less attention to evaluating outcomes or addressing moral questions.”

Promises and Perils

The allure of smart city surveillance is undeniable. Cameras equipped with advanced analytics have the potential to improve pedestrian safety, traffic management and real-time crisis response, among other benefits. Yet, as Strover said, these technological advancements come with a hidden cost. “People often don’t realize the extent of surveillance in their daily lives,” she said.


Focus groups conducted by Strover’s team revealed a sizeable gap between the public’s awareness and the reality of pervasive data collection — from drones operated by fire departments to cameras monitoring city intersections. Compounding the issue, many government agencies lack comprehensive data policies. “Few units have explicit guidelines for how data is handled or shared, and this creates a black hole of accountability,” said Strover, who also co-directs the Technology and Information Policy Institute.

Concerns about surveillance extend beyond privacy to issues of control and trust. While younger generations may view surveillance as an acceptable trade-off for convenient and affordable (or even free) access to the newest technologies, older groups express greater wariness. These generational and cultural differences highlight the challenge of crafting one-size-fits-all policies.

Customizing Privacy in Public Spaces

To address these complexities, project co-lead Atlas Wang and his team have developed a “differential access model,” a framework that restricts who can access surveillance data and for what purposes. “Not everyone defines privacy the same way,” said Wang, an associate professor in the Cockrell School of Engineering’s Chandra Family Department of Electrical and Computer Engineering. “It varies across cultures, age groups and even personal preferences. Our goal is to make these systems adaptable, so individuals or communities can choose their level of comfort.”

This model aims to ensure flexibility while respecting individual privacy. Wang described the innovation as a sort of digital switchboard of knobs and sliding scales: the user could place values on various aspects of privacy or utility, and the algorithms would respond accordingly. “Our algorithms are designed to customize privacy protections,” Wang said. “For example, we can blur faces, obscure sensitive activities or apply encryption based on the context.”
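The “switchboard of knobs” Wang describes can be pictured as a set of user-adjustable sliders that map to concrete redaction steps. The sketch below is purely illustrative — the class names, slider values and thresholds are assumptions for this article, not the project’s actual software:

```python
from dataclasses import dataclass

@dataclass
class PrivacyProfile:
    """Hypothetical privacy 'sliders' a person or community might set."""
    face_blur: float = 1.0        # 0 = never blur faces, 1 = always blur
    activity_masking: float = 0.5 # how aggressively to obscure sensitive activity
    encrypt_at_rest: bool = True  # encrypt stored footage

def transforms_for(profile: PrivacyProfile, context: str) -> list[str]:
    """Map slider values plus context to the redactions applied to a frame.

    Per the article's example, an emergency context could raise the bar
    for blurring, revealing faces that routine monitoring would hide.
    """
    steps = []
    threshold = 0.9 if context == "emergency" else 0.3
    if profile.face_blur >= threshold:
        steps.append("blur_faces")
    if profile.activity_masking >= 0.5:
        steps.append("mask_sensitive_activity")
    if profile.encrypt_at_rest:
        steps.append("encrypt")
    return steps
```

For example, a profile with `face_blur=0.5` would have faces blurred during routine monitoring but not in a declared emergency — the kind of context-dependent protection Wang describes.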

Take public safety, for instance. A city’s police department could use the differential access model to detect suspicious behavior without identifying individuals unless an emergency warrants it. However, deciding who gets access to that sensitive data — whether raw or processed, and whether the recipient is law enforcement, city planners or researchers — is a governance issue, according to Wang.
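One way to think about the governance question is as a tiered access policy: each role sees only the data tier it is entitled to, with raw footage gated behind explicit authorization. The roles and rules below are assumptions for illustration, not the project’s actual governance model:

```python
# Illustrative differential-access policy: which data tier each role may
# see by default. "raw" footage is deliberately absent from every default
# entitlement; law enforcement can reach it only with an emergency warrant.
ACCESS_POLICY: dict[str, set[str]] = {
    "law_enforcement": {"processed"},
    "city_planner":    {"aggregate"},
    "researcher":      {"aggregate"},
}

def allowed(role: str, data_tier: str, emergency_warrant: bool = False) -> bool:
    """Return True if `role` may access `data_tier` under this policy."""
    if data_tier == "raw":
        # Raw footage requires both the right role and an active warrant.
        return role == "law_enforcement" and emergency_warrant
    return data_tier in ACCESS_POLICY.get(role, set())
```

Under this sketch, a city planner can query aggregate statistics but never identifiable footage, while police access to raw video depends on the emergency condition Wang mentions.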