The Cavalier Daily
Serving the University Community Since 1890

“Co-Opting AI: Privacy” event explores the dangers of data collection

The virtual event is part of a public speaker series focused on AI technology and its application in society


University Law Professor Danielle Citron and Jasmine McNealy, professor in the University of Florida’s department of media production, management and technology, examined the risks of data collection at an event Wednesday. Citron and McNealy discussed the gathering and selling of information through sound recordings and health apps, as well as the future of regulation on data distribution.

The University’s Karsh Institute of Democracy, the University’s Sloane Lab and New York University’s Institute for Public Knowledge co-hosted the all-virtual event. 

The event was part of the “Co-Opting AI” public speaker series, which focuses on AI technology and how it interacts with society in areas ranging from language to athletics. Wednesday’s installment, “Co-Opting AI: Privacy,” focused on how artificial intelligence systems impact privacy and how to address the challenges they raise.

The “Co-Opting AI” series is moderated by Mona Sloane, faculty co-lead of the University’s Digital Technology for Democracy Lab and assistant professor of data science and media studies. Sloane currently runs the Sloane Lab, which conducts research on the implications of technology for the organization of social life, with a focus on AI as a social phenomenon in the modern world.

The event began with Sloane recognizing that data collection is a crucial and inevitable part of AI, although this collection can lead to breaches of privacy. Data is collected through everyday browsing, clicking on ads or even hovering over an image for longer than usual, according to Citron. Oftentimes, organizations collect this data to encourage individuals to make certain purchases or engage with certain services.

“Not much AI would be possible without some form of privacy invasion, simply because AI is not possible without enormous amounts of data,” Sloane said. “Data for AI is collected as we move through our digitally mediated work.”

McNealy began the discussion of privacy by addressing personal identity and how it is expressed through data.

“I think there's an intense conflict between who we are and who we are computed to be,” McNealy said. “What that means for us as we have to participate in systems that want to know what and who we are.”

Specifically, McNealy discussed sound collection and recording devices in public spaces, and what data brokers do with that collected information. Such sound data is known as bioacoustics. McNealy mentioned the example of bioacoustics being recorded in subway trains to diagnose potential track problems, as the Metropolitan Transportation Authority in New York City implemented in February.

McNealy also referenced the example of bioacoustics in applications that arose during the pandemic, allowing individuals to cough into a device and then receive a diagnosis based on the sound. McNealy used this example to emphasize how much information a small sound may contain.

“We have absolutely no idea where this store of cough [sounds] is and how it's being used,” McNealy said. “So why do I care? Because your sound tells so much about you … and inferences or predictions can be made about you.”

McNealy expressed concern over not only the data collection itself, but the lack of control users have over it and what it is used for once it has entered data brokers’ systems. To combat this, McNealy called for federal intervention to turn authority back to the users instead of the data collectors.

“We absolutely need federal intervention into this space,” McNealy said. “That intervention needs to be on the side of us retaining some kind of authority over deciding who we are, instead of machines … deciding who we are and what we could be.”

Citron focused on the impact of privacy invasion on women and other marginalized groups, specifically in terms of non-consensual intimate imagery. This may include nudity or other images of exploitation and shame shared without consent. 

“More often people who are pictured are women, they're LGBTQ individuals, and more often they are of a marginalized identity of another sort,” Citron said. 

Citron also discussed health information apps, such as menstrual cycle tracking apps, that gather and sell information users believe is private. She also addressed why users of these apps should be worried about the distribution of their information. Not only is it being used to tailor advertising, but it is also being used in ways that shape their lives and the opportunities presented to them.

“[The data] is being used to make decisions, to classify us and score and rank us, which is used for important life opportunities, [like] whether we get jobs or promotions,” Citron said. “If you've got really painful cramps, maybe I don't want to hire you, right?”

McNealy and Citron also discussed the role of consent in data sharing. This consent often comes in the form of accepting privacy policies or accepting cookies on a website. Both concluded that requiring consent from users will not have a significant enough impact on the dangers of privacy invasion.

McNealy made the argument that individuals are too uninformed on the process and purpose of data distribution to make decisions about how and when their data can be shared.

“[Requiring consent] is putting the onus on the individual to make a decision about themselves that they have absolutely no expertise in,” McNealy said. “It ignores the inherent power imbalance between an individual and an organization.”

Citron furthered this argument, characterizing the idea of meaningful consent as more myth than reality. She highlighted that more often than not, people scroll past privacy policies and simply accept whatever terms appear on their screen.

“We think of consent as like sprinkling fairy dust on all things, right? That consent can cure all problems,” Citron said. “And the truth of the matter, it's much more myth than it is real.”

McNealy and Citron also explored potential approaches to regulating data collection and distribution in the future. While both acknowledged the difficulty of forming a concrete plan that would solve every problem with data collection, McNealy emphasized the importance of community consensus in how public records and other data archives operate.

“There has to be a mechanism created for communities to be actively involved in consent and governance of data, collection, use, storage, access, all of those parts of the [data] life cycle,” McNealy said. “That doesn't solve everything, but it is so different than what we now have available.”

The next event in the series, “Co-Opting AI: Taxes,” will take place April 16. Registration is open on the Digital Technology for Democracy Lab website.
