Creepy computers or people partners? Working to make AI that enhances humanity

Posted on April 11, 2019

Driving Digital Innovation in AI is a series of stories about how Penn State’s Institute for CyberScience researchers are studying machine learning and AI to make these technologies more useful and effective, as well as investigating the impact of AI on people and society.

New technologies — from the stone age to the atomic age — tend to inspire hope and fear in equal measure.

Artificial intelligence is no different.

For some, artificial intelligence — AI — could herald a new age of personalized medicine, self-driving cars and a myriad of other applications that will make life better, smarter and more convenient. For others, though, AI's eerie ability to do things formerly done only by humans makes it a potential threat to both their sense of control and their privacy.

For S. Shyam Sundar, James P. Jimirro Professor of Media Effects at Penn State's Donald P. Bellisario College of Communications and an Institute for CyberScience (ICS) affiliate, the transformative potential of AI means we must carefully study how we adopt and integrate this technology into everyday life. Sundar, co-director of the Media Effects Research Laboratory at Penn State, and his team of graduate researchers at the laboratory focus much of their attention on how technology changes humans personally and socially — and investigate ways to make sure that change is positive. The group has studied — and continues to study — a range of issues at the nexus of media, technology and human behavior, including fake news, virtual healthcare assistants, the effects of technological personalization and human-robot interaction.

Cooperative or creepy?

People tend to be conflicted about the use of machine learning and AI in their own lives, said Sundar. On the one hand, most people have grown accustomed to interacting with computers. On the other, a computer that seems so intelligent it displays signs of will, or agency, can make people uncomfortable. When privacy and personal data are involved, that discomfort can lead to wariness about AI for some people.

“We are increasingly giving away control to machines — and, in that respect, we should be figuring out strategies for making AI systems that will do things that do not seem humanly possible — like detect fake news — but, at the same time, we don’t want machines that will invade our privacy and take away our sense of humanness,” Sundar said.

To work so intelligently and provide personalized results for the user, AI systems need lots of data — and giving up that data can make people feel that their privacy is being threatened. Sundar suggests designers and developers will need to carefully balance usefulness and privacy when they create AI-enabled computing systems, which, to many people, appear to be so intelligent that they seem to have minds of their own.

“The psychology of machine agency is based on the fact that people think of AI as an interaction source, like a computer, only with more agency,” said Sundar. “When it comes to privacy, people tend to think that computers snoop too much, they tend to know too much about us. You need to build in those types of sensitivities to address this feeling of invasion of privacy in future systems.”

According to Sundar, AI systems that are more transparent about how they use data, and that give people a better understanding of how the technology will manage their permissions, have a better chance of walking the fine line between helpful and intrusive.

“But, transparency is not always feasible because it can be used to game the algorithms by bad actors,” Sundar said.

In the future, smart machines may morph into smart environments — like smart homes — and people may find themselves not just dealing with a single AI-enabled device or app, but actually surrounded by artificial intelligence. Sundar said that in those environments, the focus should be on building synergy between machines and humans, with the user becoming part of the smart environment and human and AI blending together.

“The smart technologies of the future are the ones that will be more relational, so when you say you have a smart home, it doesn’t mean that just your home is going to be smart, but rather it means that you and your smart home will be integrated in some way,” said Sundar. “Both you and the home will be part of the same entity, the same living ecosystem. So, you are talking to your environment and your environment is talking to you in a seamless fashion.”

AI as machine helpers and extended minds

Sundar added that one of the ways artificial intelligence can be positively integrated into society is through its ability to extend — or augment — human capabilities. He pointed to fake news detection as an example: an AI system trained to sniff out false news stories could one day serve as a helpful guide for busy people navigating an increasingly chaotic media landscape.

“There are so many aspects of a news story — sources, networks, linguistic features, just to name a few — for us to determine if that story is legitimate, or not,” said Sundar. “The human brain can only stack so much, even trained journalists can have only so many variables in their head when they are analyzing a story. But a machine can, within seconds, go through hundreds of stories and can check all of those variables.”
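Sundar describes no specific implementation, but the idea of a machine checking many variables at once can be illustrated with a toy sketch. The Python below combines hypothetical source, network and linguistic signals into a single credibility score; every feature name, weight and threshold is an assumption made up for illustration, and a real detector would instead learn such weights from labeled data using far richer features.

```python
# Toy sketch of a fake news detector that combines several kinds of signals
# (source, network and linguistic features) into one credibility score.
# All feature names, weights and thresholds here are hypothetical placeholders.

from dataclasses import dataclass


@dataclass
class Story:
    text: str
    source_reputation: float  # 0.0 (unknown outlet) to 1.0 (well-established), assumed precomputed
    share_velocity: float     # shares per hour on social networks, assumed precomputed


SENSATIONAL_PHRASES = {"shocking", "miracle", "exposed", "you won't believe"}


def linguistic_score(text: str) -> float:
    """Crude linguistic signal: penalize sensational wording and all-caps shouting."""
    lowered = text.lower()
    sensational_hits = sum(phrase in lowered for phrase in SENSATIONAL_PHRASES)
    caps_ratio = sum(c.isupper() for c in text) / max(len(text), 1)
    return max(0.0, 1.0 - 0.2 * sensational_hits - caps_ratio)


def credibility_score(story: Story) -> float:
    """Weighted mix of source, network and linguistic signals (weights are invented)."""
    network_score = 1.0 / (1.0 + story.share_velocity / 1000.0)  # virality alone is weak evidence
    return (0.5 * story.source_reputation
            + 0.2 * network_score
            + 0.3 * linguistic_score(story.text))


if __name__ == "__main__":
    story = Story(
        text="SHOCKING: Miracle cure EXPOSED by anonymous insider!",
        source_reputation=0.1,
        share_velocity=5000.0,
    )
    score = credibility_score(story)
    print(f"credibility score: {score:.2f}")
    print("flag for review" if score < 0.5 else "looks plausible")
```

The point of the sketch is simply the one Sundar makes above: a machine can evaluate all of these variables for hundreds of stories within seconds, while a human reader can keep only a few of them in mind at a time.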

Eventually, if designers and developers can make AI systems that are helpful but not creepy, people may begin to see AI as an extension of themselves, just as other pieces of technology have been incorporated into everyday life, according to Sundar.

“This raises the idea of mutual augmentation — AIs can augment human abilities in areas where, previously, we were limited, and humans can train algorithms to serve us better,” said Sundar. “Good AI systems serve as an extra mind, almost like an extra arm or extra leg. Just like the automobile extends our ability to travel, AI extends our ability to make decisions.”
