Professor shares insights regarding online behavior at Mudd Center lecture

Jennifer Goldbeck discusses algorithms’ role in online activity in a lecture Thursday

Elyse Ferris

Professor Jennifer Goldbeck of the University of Maryland, College Park emphasized the influence of algorithms on people’s online activities during a lecture at Washington and Lee on Thursday.

The Mudd Center for Ethics hosted “Footprints in the Digital Dust: How Your Online Behavior Says More Than You Think.” Goldbeck, the director of the Social Intelligence Lab and associate professor of information studies at UMCP, is a world leader in social media research and science communication.

Goldbeck cited studies in which algorithms gleaned personal information about people based on their activities on the internet.

“I build algorithms, and they take data from social media and they find things out about you. I think it’s really surprising to see the ways they work,” Goldbeck said.

Goldbeck mentioned colleagues at Cambridge who predicted the behavioral traits, political preferences, intelligence and other attributes of Facebook users entirely from analyzing their ‘likes.’
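The technique behind such predictions is, at bottom, ordinary supervised learning. The sketch below is a hypothetical illustration of the general approach with synthetic data, not the Cambridge researchers’ actual model: a classifier is trained on a user-by-page matrix of ‘likes’ to predict a hidden attribute.

    # Hypothetical sketch: predicting an attribute from "likes."
    # The data is synthetic; this is not the Cambridge model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_users, n_pages = 1000, 50
    # Each row is a user; each column is 1 if the user liked that page.
    likes = rng.integers(0, 2, size=(n_users, n_pages))
    # Fake ground truth: the attribute correlates with a few pages.
    signal = likes[:, 0] + likes[:, 1] - likes[:, 2]
    labels = (signal + rng.normal(0, 0.5, n_users) > 0.5).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        likes, labels, test_size=0.2, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")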

Goldbeck noted two factors that made the Cambridge findings possible: homophily, the tendency for people to associate with similar people, and the spread of ideas through social networks.
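Homophily is what lets an algorithm learn things a user never shared: if enough of a person’s friends have a known attribute, that person’s attribute can often be guessed. A toy sketch of that inference, with an invented friend graph and labels:

    # Toy homophily-based inference: guess a user's attribute from
    # their friends' attributes. Graph and labels are hypothetical.
    from collections import Counter

    friends = {"alice": ["bob", "carol", "dave"]}
    known = {"bob": "left-leaning", "carol": "left-leaning",
             "dave": "right-leaning"}

    def infer_attribute(user):
        # Majority vote over friends whose attribute is known.
        votes = Counter(known[f] for f in friends[user] if f in known)
        return votes.most_common(1)[0][0] if votes else None

    print(infer_attribute("alice"))  # -> "left-leaning"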

“You can’t control what the algorithms find out about you, because they’re not relying on logical connections,” Goldbeck said. “You can’t be careful about what the algorithms will infer about you because you don’t know what action is going to tell them. You can’t understand or explain it or get out of it.”

Goldbeck argued that some algorithms are helpful, like Netflix’s movie and show suggestions based on users’ preferences, but some are overreaching.

She mentioned a case in which a Minnesota Target store exposed a teenage girl’s pregnancy to her parents.

Based on the girl’s purchase history, Target’s algorithm predicted that she was pregnant, and the store sent the high schooler advertisements and coupons for baby items. The girl’s father complained to his local Target, only to learn from his daughter that she was in fact pregnant.
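Published accounts of the case describe a “pregnancy prediction score” computed from weighted purchases. The sketch below follows that general shape only; the products, weights and threshold are all invented for illustration.

    # Invented weighted-purchase score in the spirit of the reported
    # Target model. None of these products or weights are real.
    WEIGHTS = {
        "unscented lotion": 0.4,
        "calcium supplement": 0.3,
        "oversized tote bag": 0.2,
        "cotton balls": 0.1,
    }

    def pregnancy_score(purchases):
        # Sum the weights of any scoring items in the purchase history.
        return sum(WEIGHTS.get(item, 0.0) for item in purchases)

    history = ["unscented lotion", "calcium supplement", "cotton balls"]
    if pregnancy_score(history) > 0.5:  # threshold is also invented
        print("flag customer for baby-item promotions")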

Other times, Goldbeck said, algorithms can be downright inaccurate.

Goldbeck pointed to Cathy O’Neil’s book “Weapons of Math Destruction,” in which O’Neil describes a teacher-evaluation algorithm used by D.C. public schools that led to strong teachers being fired.

“There is a great deal of reliance on automation being correct even when it’s not,” Goldbeck said.

Goldbeck and her peers also use algorithms to predict the future.

She built an algorithm that predicts continued sobriety or relapse in new members of Alcoholics Anonymous. The algorithm considers data used by addiction researchers, such as coping style and social support.
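As a rough sketch of that kind of system (a fabricated toy model, not Goldbeck’s actual algorithm), a classifier trained on features such as coping style and social support can both make the 90-day prediction and report which features drove it:

    # Fabricated toy model in the spirit of the sobriety predictor
    # Goldbeck described; the features and data are synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    n = 500
    coping = rng.random(n)    # hypothetical coping-style score, 0..1
    support = rng.random(n)   # hypothetical social-support score, 0..1
    X = np.column_stack([coping, support])
    # Synthetic outcome: better coping and support -> sober at 90 days.
    y = (0.6 * coping + 0.4 * support
         + rng.normal(0, 0.15, n) > 0.5).astype(int)

    model = RandomForestClassifier(random_state=0).fit(X[:400], y[:400])
    print(f"held-out accuracy: {model.score(X[400:], y[400:]):.2f}")
    # Unlike a black box, feature importances hint at why a prediction
    # was made, the kind of interpretability Goldbeck emphasized.
    print(dict(zip(["coping", "support"], model.feature_importances_)))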

“We can predict with about 85-90 percent accuracy on the day you go to AA if you are going to be sober in 90 days,” Goldbeck said. “The benefit of the way that we did it is that if it says no, it’s not going to work, it can tell you why. I hope that’s the way this field is going.”

Goldbeck pointed out that privacy is a major problem concerning the collection of data fed into these algorithms. Many helpful apps, such as Uber, constantly track users’ locations and can pull virtually any information from their phones.

“Personal privacy is a luxury item. We’re in this environment now where you don’t have any ownership of this data,” Goldbeck said.

Crystal Knows, a tool that helps users write empathetic emails, uses information from past communications to analyze the personalities of email recipients.

“We’re being helped. Most people are okay with that happening. But what if I started feeding your personal emails with your spouse into those algorithms? What are we okay with being used as inputs? Who are we okay with seeing the outputs? Where is the line? That line varies hugely for people,” she said.

Goldbeck stressed the need for immediate ethical and legal discussions about the invasive nature of this use of data.

“Once people start using [these tools] and making money from them, it’s going to be really hard to take them away and change the rules . . . Consent ultimately, for me, is the core issue underlying all of this,” she said.