
People may be wary of health information and advice on crowdsourced sites, even when there are indications that medical experts and professionals have reviewed the information. Image: Wikimedia Commons

Healthy skepticism: People may be wary of health articles on crowdsourced sites

Posted on October 15, 2020

UNIVERSITY PARK, Pa. — People may be skeptical about medical and health articles they encounter on crowdsourced websites, such as Wikipedia and Wikihealth, according to researchers. While that may be good news for health officials who worry that these sites allow non-experts to easily add and edit health information, the researchers also found that having medical professionals curate content on those sites may not reduce that skepticism.

“There are major concerns about health misinformation that’s floating around, especially now with COVID-19,” said S. Shyam Sundar, James P. Jimirro Professor of Media Effects, co-director of the Media Effects Research Laboratory and affiliate of Penn State’s Institute for Computational and Data Sciences (ICDS). “Now that anybody and everybody can generate health-related posts, it is natural to be concerned that information on these crowdsourced websites might influence people. Our study suggests that health practitioners need not get too worked up about these sites. Laypersons, like the participants in our study, do not trust the crowd, nor do they think that the information they provide is comprehensive.”

In prior research, scientists have found that people tend to trust online content that appears popular with the crowd, such as posts that earn likes and retweets, a phenomenon also referred to as the bandwagon effect, said Yan Huang, assistant professor of integrated strategic communication at the University of Houston and lead author of the study, who worked with Sundar. However, Huang said the findings suggest that, as far as trust goes, people draw a line between content that has been endorsed by the crowd and content that has been edited by the crowd.

“The effect of crowdsourcing may be different from the bandwagon effect because in this case, the crowd is not just liking or recommending existing content; the crowd is actually generating it,” said Huang. “And it seems that participants in our experiment were able to make that psychological distinction between the bandwagon effect — when people are just endorsing the content — versus people who are creating the content. Overall, participants didn’t trust the content that’s been collaboratively created by other people.”

The researchers, who report their findings in Health Communication, added that their experiment also revealed another side of the story: When participants noticed interactive features on the site that allowed them to edit the content, they trusted crowdsourced health articles more.

“When they realize that they could serve as the editor or author of the content, they are imbued with a sense of control on the site. The more aware they are of this interactivity, the more they trust the content,” said Huang.

According to Sundar, the findings suggest that while people trust themselves as a source of credible content, they may be reluctant to extend that trust to other people.

“When you are the source, you think that the information is credible because you can add content, but if other people can also supply content, that seems to take away the credibility,” said Sundar.

Because crowdsourced content can be manipulated to include non-scientific advice or results, many health experts may assume that credentialing crowdsourced posts would add credibility to the articles. However, the researchers found that when content was curated, or even created, by an expert such as a doctor, participants did not find the information any more credible. To signal that the source was a professional, the researchers added a doctor symbol and the name and title of a doctor to the experimental website.

“Adding that professional source layer did increase the perception that there was a gatekeeper behind the content; in other words, people did think that the crowdsourced site had more gatekeeping when there were indications of a professional source, but it did not affect credibility judgments,” said Sundar.

The researchers added that because the findings show a lack of trust in health content on crowdsourced sites, webmasters of these sites may want to add interactive features and make them prominent in the design. Interactive features tend to boost users’ positive perceptions of the source and content, which could improve the perceived credibility of crowdsourced sites, they said.

The researchers created eight different versions of a website called “Healthpedia” to test the various conditions of the experiment. The site mimicked the interface of common crowdsourcing websites, such as Wikipedia and WikiHow. Some versions identified a doctor as the content editor, while others offered editing functions and indicated that the content had been collaboratively created by more than 45 individuals.

The study was conducted with a sample of 192 participants recruited from Amazon Mechanical Turk. Participants read articles about either the negative effects of sunscreen products or the potential risks of pasteurized milk. The content was intentionally made controversial to better test the credibility of the crowd as a source, the researchers said. Because the articles were somewhat outlandish, they added, future work could test whether these findings hold for more routine health content.
