Jennifer Golbeck considers herself a private person, but she also specializes in technology with the potential to mine personal information to an unthinkable degree.
As the director of the Social Intelligence Lab at the University of Maryland, Golbeck studies the intersection of the two. Her research focuses on social media-driven analysis and predicting attributes, work that helps her design and build systems that improve how people interact with information online — and protect their privacy.
Golbeck spoke with The Baltimore Sun ahead of her keynote address at the Women to Watch networking event in October to talk policy, privacy and how to attract more women to her male-dominated field.
What do people not understand about artificial intelligence?
We hear the term all over the place. On one hand, we get pushed this Matrix, Terminator, "AI is going to dominate the world" narrative. And that's definitely wrong. AI is really dumb in a lot of ways. We're nowhere near that, and I don't think we're ever going to get near it.
The problem is when you put AI in the hands of people who can use it to get more power and money, they can find out all kinds of things about you that are incredibly invasive and use it to do all sorts of manipulative or unethical things. That’s happening right now, but it’s so far behind the scenes that we don’t know about it a lot of the time. That’s actually the thing we should be worrying about.
How can AI be better incorporated into health care without incentivizing profits over patient care?
There’s creepy stuff going on with AI in the health care space. We see a lot of concern … about how [women’s] period tracking data is going to be leveraged. Will companies be running AI to predict that [a woman] had an abortion and giving that [information] to law enforcement?
At the same time, there’s life changing work being done. There was one study where they had all this historical data from a brain and memory clinic. They trained an algorithm to look at scans and predict if people would develop Alzheimer’s. The algorithm could do that with 99% accuracy. It could make that prediction six years before a human physician would be able to see it on a scan.
It's not that AI is the problem. It's the application of it.
AI was put to use during the COVID pandemic. What lessons did we learn?
Early on in the pandemic, Apple and Google worked together to come up with a contact tracing feature in phones. There could be a zillion ways for that to go wrong, but they came up with some good privacy-respecting ways to do this where your personal data is not shared.
A lot of people didn’t opt in. It was a totally rational choice given how bad every tech company has been with using our data. I think everyone has come out of the pandemic with lower trust in institutions.
In the tech space, there are very powerful things we could have done to make this better, and there was a lack of political will in some places and a well-earned lack of trust in tech companies. This is a real challenge, because if and when the next pandemic comes along, we've learned so much and we've got a lot of great tools that we can use. I think, rightfully, a lot of people are like, no way, and that's the thing we have to fix. We need really strong policy change to protect our privacy, so we know these companies can't skirt around what is a right and ethical thing to do.
How are women represented in your field?
It’s awful. When I was a graduate student, I was pretty much always the only woman in the room. It’s sort of like an old boys’ network. It’s culturally hard to be a woman in those spaces. I think people underestimate that a lot of women avoid fields where they’re constantly told they don’t belong or are harassed when they are there, whether it’s computer science or anything else.
We need more women. Lots of places have recognized it’s a problem. It’s a big fight, but we’re trying. We’re getting kids interested at a young age. Working on that cultural shift, and having powerful voices say “all this bad acting is not OK” — that’s going to be the really necessary thing to get a big shift.
(This interview has been edited for space.)