In polarized societies, divided subgroups of people hold different perspectives on a range of topics. Aiming to reduce polarization, authorities may use debunking to lend support to one perspective over another. Debunking by authorities gives all observers shared information, which could reduce disagreement. In practice, however, debunking may have no effect or may even contribute to further polarization of beliefs. We developed a cognitively inspired model of observers' rational inferences from an authority's debunking. After observing each debunking attempt, simulated observers simultaneously update their beliefs about the perspective underlying the debunked claims and about the authority's motives, using an intuitive causal model of the authority's decision-making process. We systematically varied the observers' prior beliefs and uncertainty. Simulations generated a range of outcomes, from belief convergence (less common) to persistent divergence (more common). In many simulations, observers who initially held shared beliefs about the authority later acquired polarized beliefs about the authority's biases and commitment to truth. These polarized beliefs constrained the authority's influence on new topics, making it possible for belief polarization to spread. We discuss the implications of the model with respect to beliefs about elections.
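As an illustration of the kind of inference the abstract describes, the sketch below implements a minimal joint Bayesian update over two latent variables: which perspective is correct and whether the authority is committed to truth or biased. The variable names, probability values, and the specific likelihood function are illustrative assumptions, not the paper's implementation; they merely show how two observers with different priors over the correct perspective can come to disagree about the authority's bias after observing the same debunking attempts.

```python
import itertools

# Latent variables in the observer's intuitive causal model (names and
# numbers are illustrative assumptions, not taken from the paper):
#   world  in {"A", "B"}                  -- which perspective is correct
#   motive in {"truthful", "biased_toward_A"}  -- the authority's motive
WORLDS = ["A", "B"]
MOTIVES = ["truthful", "biased_toward_A"]

def likelihood_debunk_claim_B(world, motive):
    """P(authority debunks a claim made from perspective B | world, motive).

    Assumed causal model: a truthful authority mostly debunks claims it
    believes are false; a biased authority mostly debunks claims that
    conflict with its favored perspective, regardless of truth.
    """
    if motive == "truthful":
        return 0.9 if world == "A" else 0.1  # debunks B-claims mainly when B is wrong
    return 0.8                               # biased toward A: debunks B-claims either way

def update(prior, debunk_observed=True):
    """One Bayesian update of the joint belief P(world, motive)."""
    posterior = {}
    for w, m in itertools.product(WORLDS, MOTIVES):
        like = likelihood_debunk_claim_B(w, m)
        posterior[(w, m)] = prior[(w, m)] * (like if debunk_observed else 1 - like)
    z = sum(posterior.values())
    return {k: v / z for k, v in posterior.items()}

def make_prior(p_world_A):
    """Prior over (world, motive): observers differ only in p_world_A."""
    return {(w, m): (p_world_A if w == "A" else 1 - p_world_A) * 0.5
            for w, m in itertools.product(WORLDS, MOTIVES)}

# Two observers see the same five debunking attempts aimed at B-claims.
for label, p_A in [("observer leaning A", 0.8), ("observer leaning B", 0.2)]:
    belief = make_prior(p_A)
    for _ in range(5):
        belief = update(belief, debunk_observed=True)
    p_biased = sum(v for (w, m), v in belief.items() if m == "biased_toward_A")
    print(f"{label}: P(authority biased) = {p_biased:.2f}")
```

Running this sketch, the observer leaning toward perspective B ends up far more convinced that the authority is biased than the observer leaning toward A, even though both saw identical debunking attempts; this is one simple way the divergence described in the abstract can arise under the stated assumptions.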