A computer model seeks to explain the spread of disinformation and suggest countermeasures

It starts with a superspreader and weaves its way through a web of interactions, ultimately leaving no one unscathed. Those who have been exposed before may feel only a mild effect when they encounter a different variant.

No, it is not a virus. It is the contagious spread of misinformation and disinformation, the latter being false information intended entirely to deceive.

Now researchers at Tufts University have developed a computer model that closely mirrors how disinformation spreads in real life. The work could provide insight into how to protect people from the current contagion of disinformation, which threatens public health and the health of democracy, the researchers say.

“Our society is grappling with widespread belief in conspiracies, growing political polarization, and distrust of scientific findings,” said Nicholas Rabb, a computer science PhD student at Tufts School of Engineering and lead author of the study, published Jan. 7 in the journal PLOS ONE. “This model could help us understand how disinformation and conspiracy theories spread, and help us find strategies to counter them.”

Scientists who study the spread of information often take a page from epidemiologists, modeling the spread of false beliefs much as they would model how a disease spreads through a social network. Most of those models, however, treat everyone in the network as equally likely to accept any new belief passed on to them by their contacts.

Instead, the Tufts researchers based their model on the idea that our pre-existing beliefs can strongly influence whether we accept new information. Many people reject factual, evidence-backed information if it takes them too far from what they already believe. Health care workers have commented on the strength of this effect, observing that some patients dying of COVID have clung to the belief that COVID does not exist.

To take this into account, the researchers assigned a “belief” to each individual in the artificial social network, represented in the computer model as a number from 0 to 6, with 0 signifying strong disbelief and 6 signifying strong belief. The scale can stand in for the range of opinions on any issue.

For example, 0 might represent strong disbelief that COVID vaccines are safe and useful, while 6 could represent strong belief that they are in fact safe and effective.
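To make the scale concrete, here is a minimal sketch in Python of how each virtual individual could carry such a 0-to-6 belief value. This is not the authors' actual code; the names (`random_agent`, `BELIEF_LEVELS`) and the random initialization are illustrative assumptions.

```python
import random

# Beliefs are integers from 0 (strong disbelief, e.g. "COVID vaccines are not
# safe or useful") to 6 (strong belief that they are safe and effective).
BELIEF_LEVELS = range(7)

def random_agent():
    """One virtual individual, starting from a randomly assigned belief level."""
    return {"belief": random.choice(BELIEF_LEVELS)}

population = [random_agent() for _ in range(1000)]
```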

The model then creates a vast network of virtual individuals, along with virtual institutional agents that originate much of the information flowing through the network. In real life these could be news media, churches, governments, and social media influencers: essentially the superspreaders of information.
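A rough sketch of that setup, using the networkx library and an assumed small-world layout (the paper's actual network construction may differ), could look like this:

```python
import random
import networkx as nx

N_PEOPLE = 1000
# Assumed small-world social graph among the virtual individuals.
graph = nx.watts_strogatz_graph(N_PEOPLE, k=8, p=0.1)

# Give every virtual individual an initial belief level from 0 to 6.
for node in graph.nodes:
    graph.nodes[node]["belief"] = random.randint(0, 6)

# Add one institutional source (e.g. a media outlet) that broadcasts a fixed
# message, here belief level 6, to a large random subset of individuals.
SOURCE = "media_outlet"
graph.add_node(SOURCE, belief=6, institutional=True)
for follower in random.sample(range(N_PEOPLE), 200):
    graph.add_edge(SOURCE, follower)
```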

The model starts with an institutional source injecting information into the network. If an individual receives information close to their current belief (for example, a 5 when they hold a 6), they have a relatively high probability of updating that belief to 5. If the incoming information differs sharply from their current belief, say a 2 against a 6, they will most likely reject it altogether and hold onto their belief at level 6.
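One way to capture that distance-dependent updating, purely as an illustration (the paper's exact acceptance function is not given here), is an acceptance probability that falls off with the gap between the incoming message and the current belief:

```python
import random

def maybe_update_belief(current, incoming, sharpness=1.0):
    """Adopt the incoming belief level with a probability that shrinks as the
    gap from the current belief grows (illustrative curve, not the paper's)."""
    distance = abs(current - incoming)             # e.g. |6 - 5| = 1 vs. |6 - 2| = 4
    accept_probability = 2 ** (-sharpness * distance)
    return incoming if random.random() < accept_probability else current

# With sharpness=1.0, an agent at 6 adopts a "5" about half the time,
# but accepts a "2" only about 6% of the time.
```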

Other factors, such as the proportion of their contacts who send them the information (essentially, peer pressure) or their level of trust in the source, can also influence how individuals update their beliefs. A population-wide network model of these interactions then provides a dynamic view of the spread and persistence of disinformation.
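Those extra factors could be folded into the same acceptance probability, for instance by scaling it with the fraction of contacts relaying the message and a trust weight for the source. This weighting is an assumption made for illustration, not the authors' formula:

```python
def acceptance_probability(current, incoming, peer_fraction, trust, sharpness=1.0):
    """Blend belief distance with social factors (illustrative weighting).

    peer_fraction: share of the agent's contacts sending the same message (0-1).
    trust:         the agent's trust in the originating source (0-1).
    """
    base = 2 ** (-sharpness * abs(current - incoming))
    peer_boost = 0.5 + 0.5 * peer_fraction   # 0.5 with no peers, 1.0 if all relay it
    return min(1.0, base * peer_boost * trust)
```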

Future improvements to the model will incorporate new knowledge from both network science and psychology, as well as comparisons of model results with real-world opinion polls and network structures over time.

While the current model assumes that beliefs change only incrementally, other scenarios could be modeled that produce a larger shift in beliefs, for example a jump from 3 to 6 that might occur when a dramatic event happens to an influencer who then urges their followers to change their minds.

Over time, the computer model can be made more complex to accurately reflect what is happening on the ground, say the researchers, who, in addition to Rabb, include his faculty advisor Lenore Cowen, a professor of computer science; computer scientist Matthias Scheutz; and JP deRuiter, a professor of psychology and computer science.

“It is becoming all too clear that simply broadcasting factual information may not be enough to change public mindsets, especially among those who are locked into a belief system that is not based on facts,” Cowen said. “Our initial effort to incorporate this insight into our models of how disinformation spreads through society may teach us how to bring the public conversation back to facts and evidence.”
