The effects of news source credibility and fact-checker credibility on users' beliefs and intentions regarding online misinformation

Anol Bhattacherjee (School of Information Systems and Management, University of South Florida, Tampa, Florida, USA)

Journal of Electronic Business & Digital Economics

ISSN: 2754-4214

Article publication date: 27 October 2022

Issue publication date: 16 December 2022

Abstract

Purpose

The purpose of this research is to evaluate the extent to which the credibility of news sources and fact-checkers individually and jointly influence online users' beliefs and intended behaviors regarding online misinformation. The broader goal is to understand why fact-checking seems to have inconsistent effects on beliefs and behavioral intentions about disinformation.

Design/methodology/approach

An online experiment was conducted in a public health (COVID-19) context with 429 validated participants to test three hypotheses linking the main and interaction effects of two independent variables (news source credibility and fact-checker credibility) on three dependent variables (users' believability, reading intention and sharing intention of online news claims). The data was analyzed using multilevel (fixed effects) models controlling for individual differences, claim differences and order effects.

Findings

The author observed a nuanced pattern of effects: news source credibility had a positive main effect on believability but negative effects on reading and sharing intentions; fact-checker credibility had a positive main effect on believability and no main effect on reading or sharing intentions, but it negatively moderated the effects of source credibility on all three dependent variables.

Originality/value

This paper introduces, conceptualizes and tests whether a more credible fact-checker shapes beliefs and intentions about online misinformation differently from a less credible one, especially when examined concurrently with the corresponding effects of the original sources of misinformation claims. Additionally, it suggests that, on average, users perceive fact-checkers (even reputed ones) as having low credibility, which may explain why fact-checking is often ineffective in shaping beliefs and intended behaviors.

Citation

Bhattacherjee, A. (2022), "The effects of news source credibility and fact-checker credibility on users' beliefs and intentions regarding online misinformation", Journal of Electronic Business & Digital Economics, Vol. 1 No. 1/2, pp. 24-33. https://doi.org/10.1108/JEBDE-09-2022-0031

Publisher

Emerald Publishing Limited

Copyright © 2022, Anol Bhattacherjee

License

Published in Journal of Electronic Business & Digital Economics. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


1. Introduction

A global poll of over 25,000 online users in 25 countries reported that 86% of users had been exposed to disinformation or misinformation on social media platforms, news websites, YouTube and television, and almost nine in ten users initially believed such disinformation or misinformation to be true (Simpson, 2019). Disinformation, also called fake news, refers to news known to be false that is spread intentionally to manipulate public opinion, while misinformation refers to false or out-of-context information that may be mistakenly believed to be true and spread without the intent to deceive or mislead (Hernon, 1995). To help us separate fact from fiction, news agencies, news aggregators and social media platforms have increasingly turned to fact-checkers to validate claims and counter-claims made by politicians, public health officials and others. Such fact-checkers include fact-checking units of major news organizations, such as AP Fact Check from the Associated Press, Fact Checker from the Washington Post and Reality Check from the British Broadcasting Corporation (BBC), as well as independent fact-checkers such as Snopes, Politifact and FactCheck.org. In 2019, the International Fact-Checking Network comprised 188 fact-checking organizations spread across more than 60 countries (Stencel, 2019).

But does fact-checking indeed help combat disinformation? Prior research suggests mixed findings. While some studies reported that fact-checking corrects falsely held beliefs (e.g. Pingree, Brossard, & McLeod, 2014; Weeks & Garrett, 2014), others found no such effect (e.g. Jarman, 2016; Moravec, Minas, & Dennis, 2019). A recent meta-analysis of 30 studies (Walter, Cohen, Holbert, & Morag, 2019) found that fact-checking has a small effect on political beliefs (Cohen's d = 0.29), although prior meta-analyses found moderate to large effects (Chan, Jones, Hall, & Albarracín, 2017; Walter & Murphy, 2018).

Will fact-checking work better if it is conducted by highly credible fact-checkers? Will fact-checking have less of an impact if the original claim came from a credible news source? What might happen if a claim from a highly credible news source is discredited by a highly credible fact-checker? Or if a claim made by a less credible news source is discredited by a less credible fact-checker? These are the research questions of interest to this study.

What complicates the matter further is that Gallup polls suggest that 60% of Americans do not trust the United States (US) mass media (Brenan, 2020). Similarly, Rasmussen Reports (2016) notes that only 29% of US voters trust political fact-checking and 62% believe that news media “skew the facts” to help their preferred political candidates. Snopes, the world's largest fact-checking organization, has been blamed for lack of editorial oversight of its fact-checking staff and for disregarding standard journalistic and scientific procedures in its fact-checking process (Leetaru, 2016). Reputable fact-checkers, such as AP Fact Check, Politifact and Washington Post Fact Checker, have also been faulted for presenting personal opinions or economic assumptions as “incontrovertible facts,” for taking quotes out of context, and for using fact-checking to hide the truth, shield the powerful from accountability and slander people they did not like (Pareene, 2020; Sirota & Perez, 2021). The social media platform Facebook has been blamed for pressuring independent fact-checkers to change their ratings and for changing the label on certain videos to exempt them from fact-checking (Pasternack, 2020). Given these accounts, it remains unclear to what extent we trust news media and/or fact-checkers as arbiters of the veracity of news claims.

We explore our research questions by drawing on the source credibility literature from cognitive psychology to postulate three hypotheses regarding the main and interaction effects of news source credibility and fact-checker credibility, and then empirically testing these hypotheses using an online experiment involving fact-checking of coronavirus (COVID-19) news claims, manipulating their news sources (Reuters and Buzzfeed News) and fact-checkers (FactCheck.org and Hoax-Slayer.net) of varying credibility. Note that although there is some research on the credibility of news sources in the source credibility literature, to the best of our knowledge, there is no research yet on fact-checker credibility.

2. Hypotheses

Persuasion research (e.g. Chaiken, Liberman, & Eagly, 1989) suggests that when we lack the expertise or time to assess the veracity of uncertain information, we tend to rely on environmental cues, such as the credibility or trustworthiness of the information source, to form an opinion about that information. Source credibility is defined as the extent to which an information source is perceived to be believable, competent and trustworthy (Petty & Cacioppo, 1986) and is operationalized in terms of trustworthiness and expertise or competence (Pornpitakpan, 2004). Most empirical studies found source credibility to correct prior erroneous inferences and influence user behaviors; however, this effect is not universal (Pornpitakpan, 2004).

Given that both news sources and fact-checkers are information sources (of claims and of the validity of claims, respectively), the source credibility effect should apply to both. Hence, information coming from trusted sources, and information validated by more credible fact-checkers, should be more believable, and therefore more likely to be read and shared, than information coming from unknown or less trusted sources. Comparing user responses to 60 news sources, including mainstream media outlets, hyperpartisan websites and fake news websites, Pennycook and Rand (2019) found that online users have different levels of trust in these websites. However, it is not known whether that trust differential influences user perceptions and behaviors. We are also not aware of any comparative analysis of fact-checkers, though it is reasonable to expect that we may see some fact-checkers as more credible than others, based on their history of operation, reputation or awards. Fact-checkers sometimes disagree on ambiguous claims that are not outright falsehoods or obvious truths, and in such cases, more trusted and more credible fact-checking organizations should influence our beliefs and behavioral intentions more than less credible fact-checkers. Hence, we hypothesize:

H1.

News source credibility has positive effects on users' believability of, and their reading and sharing intentions for, online claims.

H2.

Fact-checker credibility has positive effects on users' believability of, and their reading and sharing intentions for, online claims.

How will fact-checker credibility influence claims coming from credible versus questionable news sources? Most of us prefer to source our news content from credible news sources because we trust news from such sources. Reputable news organizations recognize this and work hard to build public credibility by reporting accurate news to the extent possible. Our trust in high-credibility news sources obviates the need for fact-checking, and hence, we may pay less attention to any fact-checks, irrespective of whether those fact-checks come from highly credible or less credible fact-checkers and whether the fact-checking confirmed or disconfirmed the original claim. On the other hand, claims from less credible news sources are often viewed as being of uncertain quality, resulting in weak attitudes that are amenable to change. In such cases, fact-checking may change our perceptions and/or behaviors, especially if the fact-check comes from a high-credibility fact-checker. Hence, we propose:

H3.

Fact-checker credibility has stronger positive effects on users' believability of, and their reading and sharing intentions for, online claims from less credible news sources than for claims from more credible news sources.

3. Methods

The three hypotheses described above were empirically tested using an online experiment that employed a pretest-posttest, counterbalanced, repeated measures design with within-subjects treatments to control for both participant-level and claims-level variations. The experimental design is shown in Figure 1.

3.1 Participants

Study participants were recruited using Amazon Mechanical Turk (MTurk), restricted to the adult population in the US. Studies show that US MTurk samples are comparable to consumer panel samples (Steelman, Hammer, & Limayem, 2014). Moreover, previous fact-checking studies have also successfully used MTurk samples (e.g. Pennycook & Rand, 2019; Pennycook, Bear, Collins, & Rand, 2020). To ensure that our responses were of high quality, we excluded participants who failed to answer all four treatment manipulation check questions correctly and those who did not spend at least five minutes on the experimental task. The 5-min threshold was a conservative cutoff, given that an initial pretest showed that completing the study took at least 7 min. Our screening process resulted in a final sample size of 429 participants, with a median age of 44 and a median educational level of “some level of college.” Participants took a median time of 12.4 min to complete the assigned task. Our sample size of 429 was more than adequate for this experiment: a recent meta-analysis (Walter et al., 2019) found fact-checking to have a small effect on user beliefs (Cohen's d = 0.29), and a more conservative effect size of 0.20 in a within-subjects design requires a minimum sample size of 199 (n = 351 if the effect size is further reduced to 0.15).
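As an illustration, the sample size thresholds cited above can be reproduced with a standard power analysis for a paired (within-subjects) t-test. The sketch below uses Python's statsmodels; the α = 0.05, 80% power and two-sided test settings are our assumptions, not stated in the paper, but they recover the n = 199 and n = 351 figures.

```python
# Minimum sample size for a within-subjects (paired) design, via statsmodels.
# Assumptions (ours, not the paper's): alpha = 0.05, power = 0.80, two-sided.
from math import ceil

from statsmodels.stats.power import TTestPower

analysis = TTestPower()  # one-sample/paired t-test power analysis

for d in (0.29, 0.20, 0.15):
    n = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80,
                             alternative="two-sided")
    print(f"Cohen's d = {d}: minimum n = {ceil(n)}")
# d = 0.20 -> n = 199; d = 0.15 -> n = 351 (cf. the thresholds above)
```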

3.2 Task and treatments

Participants were exposed to ten public health claims related to the ongoing COVID-19 (coronavirus) pandemic, shown in Table 1. These claims were sourced from the US Centers for Disease Control COVID-19 frequently asked questions (https://www.cdc.gov/coronavirus/2019-ncov/faq.html), worded in neutral language to avoid biasing the study's participants, and stated ambiguously to make it difficult for participants to guess whether they were true or false. The ten claims were randomly attributed to one of two news sources, Reuters and Buzzfeed News, as manipulations of high and low credibility news sources respectively, and to two fact-checking organizations, FactCheck.org and Hoax-Slayer.net, as manipulations of high and low credibility fact-checkers, plus a control condition in which claims were not fact-checked.

The choice of Reuters and Buzzfeed News for the source manipulation was based on a 2017 survey of 28 news sources conducted by the Reynolds Journalism Institute at the University of Missouri, which found Reuters to be one of America's most trusted, nonpartisan news sources and Buzzfeed News one of the least trusted (Kearney, 2017). Although there is no similar ranking for fact-checking organizations, FactCheck.org, a nonpartisan, nonprofit project of the Annenberg Public Policy Center of the University of Pennsylvania that has won numerous awards for journalistic integrity from Time magazine, the Society of Professional Journalists and the International Academy of Digital Arts and Sciences, is widely viewed as a highly credible fact-checker. In contrast, Hoax-Slayer.net, a one-person fact-checking operation run from a home office in an outback town in Queensland, Australia, is relatively unknown to the US public and can be seen as a less credible fact-checker.

The experiment employed a two-phase design, as shown in Figure 1. In the first phase, following informed consent, participants were given brief background on the two news sources and two fact-checking sites, including their year of founding, number of employees, media awards and mode of operation, along with hypertext links to these sites for those who were unfamiliar with them. Participants were then asked four manipulation check questions to verify that they had read the background information carefully. Those who did not answer all four questions correctly were removed from the sample. User perceptions of the credibility of each news source and fact-checker were then measured as independent variables. Participants were then exposed to ten COVID-19 claims in random order and asked whether they had previously seen these claims (prior exposure), their initial attitude toward each claim (positive or negative), and how important, relevant and interesting each claim was to people in their social network. They were also asked to rate the believability of each claim, and their intentions to read the full article making that claim and to share that article on social media, on five-point semantic differential scales. In the second phase, the ten claims were attributed to one of the two sources (Reuters or Buzzfeed News), and eight of the ten claims were randomly assigned a “verified” or “disputed” rating by one of the two fact-checkers (FactCheck.org or Hoax-Slayer.net), while the remaining two claims were assigned an “unchecked” rating as the experimental control. Taking into account the news sources and fact-checks, participants were asked to again rate the believability, reading intention and sharing intention of the ten claims.
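For concreteness, the phase-two randomization described above might be sketched as follows. The function and variable names are hypothetical; this only illustrates the assignment scheme, not the author's actual experimental software.

```python
# Illustrative sketch of the phase-two treatment assignment: each of the ten
# claims gets a news source; eight claims get a "verified" or "disputed"
# rating from one of two fact-checkers; two remain "unchecked" as controls.
# Names and structure are hypothetical, inferred from the text.
import random

CLAIMS = list(range(1, 11))
SOURCES = ["Reuters", "Buzzfeed News"]
FACT_CHECKERS = ["FactCheck.org", "Hoax-Slayer.net"]
RATINGS = ["verified", "disputed"]

def assign_treatments(claims, rng=random):
    shuffled = claims[:]
    rng.shuffle(shuffled)
    assignments = {}
    for i, claim in enumerate(shuffled):
        source = rng.choice(SOURCES)
        if i < 8:  # first eight shuffled claims are fact-checked
            assignments[claim] = (source, rng.choice(FACT_CHECKERS),
                                  rng.choice(RATINGS))
        else:      # remaining two serve as the unchecked control
            assignments[claim] = (source, None, "unchecked")
    return assignments

print(assign_treatments(CLAIMS))
```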

3.3 Measurement of variables

The dependent variables were measured twice: (1) after participants' exposure to the claims but before exposure to news source or fact-checker treatments (pretreatment) and (2) after exposure to news source and fact-checker treatments (post-treatment). Believability was measured using a three-item scale adapted from Kim, Moravec, and Dennis (2019) that asked participants to rate how truthful, credible and believable they found each claim on five-point semantic differential scales. Cronbach alphas for this scale were 0.79 and 0.81 for the pretreatment and post-treatment measures, respectively. Reading and sharing intentions were measured using single-item measures that asked participants how likely they were to read the full online article and share it with their social media network using five-point semantic differential scales ranging from “extremely unlikely” to “extremely likely,” similar to Pennycook et al.'s (2020) sharing intention scale.
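The Cronbach alpha values reported above follow the standard formula, α = k/(k − 1) × (1 − Σσ²ᵢ/σ²ₜₒₜₐₗ). Below is a minimal sketch of that computation; the input data are simulated, since the raw responses are not reproduced here.

```python
# Cronbach's alpha for a multi-item scale, e.g. the three believability items.
# The data matrix here is simulated (hypothetical), for illustration only.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of scale scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Example with simulated correlated five-point ratings for three items
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(429, 1))
items = np.clip(base + rng.integers(-1, 2, size=(429, 3)), 1, 5)
print(round(cronbach_alpha(items.astype(float)), 2))
```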

The independent variables (news source and fact-checker credibility) were manipulated as treatments, but such manipulations are effective only to the extent that they are perceived as intended by participants. Hence, credibility perceptions of each organization were measured using three semantic differential items that asked to what extent participants considered each organization trustworthy, to what extent it had the necessary expertise to do its job, and how credible they perceived the organization overall. This operationalization was based on prior research that postulates trustworthiness and expertise (competence) as the two dimensions of source credibility (Petty & Cacioppo, 1986; Pornpitakpan, 2004). Cronbach alphas for source and fact-checker credibility were 0.93 and 0.84, respectively.

Several additional variables were measured as control variables: whether participants had seen each claim prior to this experiment (prior exposure), their initial positive or negative reaction to each claim before exposure to source or fact-check rating (prior attitude), and their perceived importance, relevance and interestingness of each claim. Participants' online news reading and sharing frequencies (prior to this experiment) were also measured, along with demographics such as age, gender and education.

4. Results

4.1 Descriptive analysis

Mean news source credibility for Reuters and Buzzfeed News were 2.78 and 1.95, respectively on a five-point scale (see Figure 2), while mean fact-checker credibility for FactCheck.org and Hoax-Slayer.net were 1.99 and 1.73, respectively. These means confirm that the treatment manipulations worked as intended, and that participants found Reuters and FactCheck.org to be more credible than Buzzfeed News and Hoax-Slayer.net respectively, although the gap between FactCheck.org and Hoax-Slayer.net was quite narrow. Further, since all of these means were less than 3 (the neutral point on this scale), the typical participant in this experiment may have viewed all news sources and fact-checkers, including Reuters and FactCheck.org, with some level of distrust.

Among dependent variables, mean believability was 3.07 in the pretreatment phase, dropping to 2.89 in the post-treatment phase. Pretreatment mean reading and sharing intentions were 2.92 and 2.94 respectively, which remained practically unchanged in the post-treatment phase (2.91 and 2.96). While mean believability dropped following exposure to fact-checking, this change did not lead to a corresponding change in participants' behavioral intentions.

4.2 Hypotheses testing

Hypotheses were tested using multilevel, mixed-effects linear regression models with random intercepts. Three models were created corresponding to the three dependent variables: post-treatment believability, reading intention and sharing intention. In addition to the main and interaction effects of source credibility and fact-checker credibility, each model also included a range of control variables, as shown in Table 2, and the fixed effects of participant ID, claim number and claim order to control for individual differences, claim differences and order effects, respectively. Since the “unchecked” claims had no associated fact-checker credibility, these claims were dropped from the analysis, resulting in a total of 3,432 observations from 429 participants.
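A model of this general form can be specified with statsmodels' MixedLM formula interface. The sketch below assumes hypothetical long-format column names and is only an approximation of the specification described above (random intercepts per participant, fixed effects for claim and order), not the author's exact code.

```python
# Sketch of a multilevel model with random intercepts per participant and
# fixed effects for claim and presentation order. Column names are
# hypothetical; the author's exact specification may differ.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("experiment_long.csv")  # hypothetical long-format data

formula = (
    "post_believability ~ news_src_cred * fc_cred + fact_check_value "
    "+ pre_believability + pre_attitude + prior_exposure "
    "+ C(claim_id) + C(claim_order)"   # fixed effects for claim and order
)
model = smf.mixedlm(formula, df, groups=df["participant_id"])
result = model.fit(reml=True)
print(result.summary())
```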

Variance explained (R-squared) in these models ranged from 38% for sharing intention to 43% for believability, indicating reasonably good fit with the observed data. News source credibility had a significant positive main effect on post-treatment believability (β = 0.352, p < 0.001), while its effects on post-treatment reading (β = −0.179, p < 0.001) and sharing (β = −0.093, p < 0.01) intentions were negative and significant. Hence, Hypothesis H1 was supported for believability but not for reading and sharing intentions. Fact-checker credibility had a significant positive main effect on post-treatment believability (β = 0.080, p < 0.01), but its effects on post-treatment reading intention (β = 0.008, p > 0.05) and sharing intention (β = 0.046, p > 0.05) were positive but nonsignificant, providing weak support for Hypothesis H2; all of these fact-checker main effects were too small to be of practical interest. Lastly, the interaction between news source and fact-checker credibility was significant and negative for all three dependent variables: post-treatment believability (β = −0.122, p < 0.001), reading intention (β = −0.036, p < 0.05) and sharing intention (β = −0.044, p < 0.01), supporting Hypothesis H3.

5. Discussion and conclusions

5.1 Implications for research

This study revealed some interesting and potentially important findings. From the descriptive statistics, we see that, on average, both news media and fact-checkers are more distrusted than trusted. This includes even reputable news sources like Reuters and reputable fact-checkers like FactCheck.org. We also found fact-checkers to be less credible than news media, and the credibility gap between more reputable fact-checkers like FactCheck.org and less reputable ones like Hoax-Slayer.net to be quite slim. These observations are concerning because if fact-checkers want to be seen as arbiters of truth, such low credibility ratings certainly cannot help their cause. Moreover, the narrow gap between FactCheck.org and Hoax-Slayer.net suggests that even the most reputable and recognizable fact-checkers have failed to inspire public trust in their operations.

Descriptive statistics also showed that participants' believability of news claims dropped after fact-checking, suggesting that rather than inspiring confidence, the mere process of fact-checking may actually seed doubt in people's minds about the veracity of news claims, irrespective of whether the original claim is validated or refuted. Reading and sharing intentions remained practically unchanged from the pretreatment to the post-treatment stage, suggesting that even if fact-checking influences our perceptions of online claims, it has little impact on our behavioral intentions.

Although the main effects of news source and fact-checker credibility cannot be interpreted in isolation given the presence of interaction effects, these main effects are nevertheless insightful. Our findings suggest that news source credibility increases the believability of online claims but has opposite (negative) effects on our reading and sharing intentions. Perhaps we read and share claims from less credible sources because we are more curious about those claims. The main effects of fact-checker credibility on all three dependent variables were too small to be of interest. This nonsignificance may be the result of large standard errors relative to effect sizes, or it may suggest that fact-checking has no direct effect on user beliefs or behaviors but rather influences the dependent variables only via its interaction with news source credibility.

The negative interaction effects of source and fact-checker credibility on user believability and intentions indicate that the marginal effect of fact-checking diminishes with increasing source credibility. In other words, fact-checking is beneficial only when the original news source has questionable credibility. Hence, rather than fact-checking every claim, perhaps fact-checkers should only examine claims from dubious news sources.
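This moderation can be made concrete with the coefficients reported in Table 2. Assuming the predictors enter the believability model uncentered (a coding assumption not stated in the paper), the marginal effect of fact-checker credibility is

\[
\frac{\partial\,\text{Believability}}{\partial\,\text{FCCred}} = 0.080 - 0.122 \times \text{SrcCred},
\]

which shrinks toward zero (and eventually reverses in sign) as news source credibility rises; the exact crossover point depends on how the predictors were scaled and centered.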

Lastly, fact-checker credibility had consistent effects on user beliefs, but not on intentions. Although psychology research holds that beliefs influence intentions, we see that this relationship falls apart in the context of disinformation fact-checking. This belief-intention inconsistency calls into question the relevance of classic psychology theories in today's post-truth world and may be a fruitful avenue for future research.

5.2 Implications for practice

The results suggest that our blind faith in fact-checking, especially in highly credible fact-checkers like AP FactCheck or FactCheck.org, may be misguided. This does not imply that we should not invest in fact-checking, but perhaps we should focus our fact-checking efforts on claims from questionable news sources, in order to extract the most benefit from fact-checking.

Second, while we have media rankings and ratings of news sources to guide our news consumption behaviors, we do not have any such rankings or ratings of fact-checkers to guide our acceptance of news claims. Such ranking or rating of fact-checking organizations may help communicate fact-checker credibility to the public, and thereby help leverage the fact-checker credibility effects observed in this study.

In conclusion, the relative inefficacy of fact-checking, in the face of growing numbers of news claims of questionable veracity, should be a warning sign for all of us. Our general lack of trust in news sources and fact-checkers is a major challenge to societal harmony. Current efforts by social media companies to block certain claims and sources that presumably spread disinformation may still fall well short in the fight against disinformation, given our limited trust in the fact-checking process, unless they take a more nuanced approach to enhancing public trust in the news media and the fact-checking process.

Figures

Figure 1. Experimental design

Figure 2. Descriptive statistics

Table 1. COVID-19 claims used in the study

1. Contact tracing can reduce the spread of COVID-19
2. Centers for Disease Control and Prevention (CDC) recommends wearing two masks for adequate protection against COVID-19
3. COVID-19 virus has been detected in human feces and wastewater
4. A cheap, widely available drug called dexamethasone provides effective COVID-19 relief among severely sick patients
5. Contact lens disinfecting solution can kill the COVID-19 virus on contact
6. Vaccines may not prevent people from contracting COVID-19 - Dr. Fauci
7. People who have recovered from COVID-19 have acquired immunity to the disease
8. Doctors say that children are at lower risk of contracting COVID-19 than adults
9. World Health Organization (WHO): People with autoimmune and other serious diseases should avoid COVID-19 vaccines for now
10. The drug remdesivir is known to reduce deaths among COVID-19 patients

Table 2. Regression estimates

| Variable | Believability | Reading intention | Sharing intention |
| --- | --- | --- | --- |
| PriorNewsExposure | 0.037* (0.02) | −0.059** (0.03) | −0.054* (0.03) |
| NewsInterestingness | | 0.052*** (0.02) | 0.044** (0.02) |
| NewsRelevance | | 0.064*** (0.02) | 0.084*** (0.02) |
| NewsImportance | | 0.011 (0.02) | −0.0001 (0.02) |
| PreAttitude | 0.021* (0.01) | −0.136*** (0.01) | 0.011 (0.01) |
| PreBelievability | 0.211*** (0.02) | | |
| PreReadingIntention | | 0.132*** (0.02) | |
| PreSharingIntention | | | 0.154*** (0.02) |
| Post-Believability | | 0.632*** (0.03) | 0.664*** (0.03) |
| NewsSourceCredibility | 0.352*** (0.03) | −0.179*** (0.04) | −0.093** (0.04) |
| FCCredibility | 0.080** (0.05) | 0.008 (0.05) | 0.046 (0.05) |
| NewsSrcCred*FCCred | −0.122*** (0.01) | −0.036* (0.02) | −0.044** (0.02) |
| FactCheckValue | 0.396*** (0.02) | −0.064** (0.03) | −0.043 (0.03) |
| Intercept | 1.457*** (0.21) | 1.598*** (0.31) | 0.535 (0.31) |
| R-squared | 0.427 | 0.385 | 0.376 |
| Adjusted R-squared | 0.340 | 0.291 | 0.280 |
| F-statistic | 4.895*** (df = 453; 2,978) | 4.077*** (df = 457; 2,974) | 3.922*** (df = 457; 2,974) |

Note(s): *p < 0.05, **p < 0.01, ***p < 0.001; standard errors in parentheses. Fixed effects not shown to conserve space

References

Brenan, M. (2020). Americans remain distrustful of mass media. Washington, DC: Gallup. available from: https://news.gallup.com/poll/321116/americans-remain-distrustful-mass-media.aspx

Chaiken, S., Liberman, A., & Eagly, A. H. (1989). Heuristic and systematic information processing within and beyond the persuasion context. In Uleman, J.S., & Bargh, J.A. (Eds.), Unintended Thought (pp. 212-252). New York: Guilford Press.

Chan, M. S., Jones, C. R., Hall, J. K., & Albarracín, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, 28, 1531-1546.

Hernon, P. (1995). Disinformation and misinformation through the Internet: Findings of an exploratory study. Government Information Quarterly, 12, 133-139.

Jarman, J. W. (2016). Influence of political affiliation and criticism on the effectiveness of political fact checking. Communication Research Reports, 33, 9-15.

Kearney, M. W. (2017). Trusting news project report. Reynolds Journalism Institute. available from: https://www.rjionline.org/reporthtml.html

Kim, A., Moravec, P. L., & Dennis, A. R. (2019). Combating misinformation on social media with source ratings: The effects of user and expert reputation ratings. Journal of Management Information Systems, 36, 931-968.

Leetaru, K. (2016). The daily mail Snopes story and fact checking the fact checkers. Forbes. available from: https://www.forbes.com/sites/kalevleetaru/2016/12/22/the-daily-mail-snopes-story-and-fact-checking-the-fact-checkers/?sh=77d28074227f

Moravec, P. L., Minas, R. K., & Dennis, A. R. (2019). Fake news on social media: People believe what they want to believe when it makes no sense at all. MIS Quarterly, 43, 1343-1360.

Pareene, A. (2020). How political fact-checkers distort the truth. New Republic. available from: https://newrepublic.com/article/156039/political-fact-checkers-distort-truth

Pasternack, A. (2020). Facebook is quietly pressuring its independent fact-checkers to change their rulings. Fast Company. available from: https://www.fastcompany.com/90538655/facebook-is-quietly-pressuring-its-independent-fact-checkers-to-change-their-rulings

Pennycook, G., & Rand, D. G. (2019). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116, 2521-2526.

Pennycook, G., Bear, A., Collins, E. T., & Rand, D. G. (2020). The implied truth effect: Attaching warnings to a subset of fake news headlines increases perceived accuracy of headlines without warnings. Management Science, 66, 4944-4957.

Petty, R. E., & Cacioppo, J. T. (1986). Communication and persuasion: Central and peripheral routes of attitude change. New York: Springer-Verlag.

Pingree, R. J., Brossard, D., & McLeod, D. M. (2014). Effects of journalistic adjudication on factual beliefs, news evaluations, information seeking, and epistemic political efficacy. Mass Communication and Society, 17, 615-638.

Pornpitakpan, C. (2004). The persuasiveness of source credibility: A critical review of five decades' evidence. Journal of Applied Social Psychology, 34, 243-281.

Rasmussen Reports (2016). Voters don't trust media fact-checking. available from: https://www.rasmussenreports.com/public_content/politics/general_politics/september_2016/voters_don_t_trust_media_fact_checking

Simpson, S. (2019). Fake news: A global epidemic. Vast majority (86%) of online global citizens have been exposed to it. Ipsos Public Affairs. available from: https://www.ipsos.com/en-us/news-polls/cigi-fake-news-global-epidemic

Sirota, D., & Perez, A. (2021). Lies, damn lies, and fact-checking. Daily Poster. available from: https://www.dailyposter.com/p/lies-damn-lies-and-fact-checking

Steelman, Z. R., Hammer, B., & Limayem, M. (2014). Data collection in the digital age: Innovative alternatives to student samples. MIS Quarterly, 38, 355-378.

Stencel, M. (2019). Number of fact-checking outlets surges to 188 in more than 60 countries. The Poynter Institute: St. Petersburg, FL, June 11. available from: https://www.poynter.org/fact-checking/2019/number-of-fact-checking-outlets-surges-to-188-in-more-than-60-countries/

Walter, N., & Murphy, S. T. (2018). How to unring the bell: A meta-analytic approach to correction of misinformation. Communication Monographs, 85, 423-441.

Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2019). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37, 350-375.

Weeks, B. E., & Garrett, R. K. (2014). Electoral consequences of political rumors: Motivated reasoning, candidate rumors, and vote choice during the U.S. presidential election. International Journal of Public Opinion Research, 26, 401-422.

Corresponding author

Anol Bhattacherjee can be contacted at: abhatt@usf.edu
