How to deal with misinformation in online communities

Misinformation is rampant on social platforms large and small. Even LinkedIn, a network of professionals, is no longer immune to discussions that stretch the definition of truth.

Inevitably, this will trickle down to the platforms you manage, even private communities.

So what should a community manager do?

Cecilia Sepp, nonprofit expert and founder of Rogue Tulips, LLC, pondered this question in a blog post on the topic at the end of last year. She suggested that community guidelines can lay the groundwork for keeping a discussion from going off the rails.

“Our Community Guidelines may need to be reviewed as we need a starting point to refer to what should and shouldn’t be picked up,” Sepp said. “And a professional forum is not a place where you should discuss or write about political issues on a personal level.”

While an association’s forum may be an appropriate space to discuss a policy position it has taken, it is not carte blanche to discuss policy issues in general, she added.

“If you’re in a community of your peers within a profession, trade or industry, you need to be aware that it’s a different level of conversation,” she said.

What is the truth, anyway?

Thanks to the rise of misinformation, community managers are struggling to manage how people discuss topics in professional forums. A “factual” claim without supporting sources — or with sources that are themselves flawed — can directly misinform other members of the community, a problem that long predates the rise of the internet. Sepp cited urban legends, such as the widely circulated myth that Coca-Cola removes rust, as an example.

When actors in a conversation, online or otherwise, come in with an agenda, it may not lead to misinformation per se, but it can create a climate where misinformation takes hold.

“A lot of bad behavior starts with people trying to impose their personal beliefs on others without having people’s permission to discuss it,” she said. “And I think a lot of [this happens] as we try to be more aware of people in the world and how sensitive people are to different issues.”

Even if something appears to be factual, if it’s poorly researched or misrepresented, it can amount to misinformation. Sepp pointed out how scientific data can be more fluid than it seems.

“We like to think the science is very clear,” she said, “but there’s a lot of science that really isn’t.”

The health sciences are a good example of this, as evidenced by the often complex discussions around COVID-19, in which so little is understood about the disease that theories are often presented as fact, and information can be hand-picked to support either side of an argument.

One approach she suggested for navigating this murky environment: ask users, when sharing an opinion, to make clear that they are offering a point of view rather than stating facts.

“If we could train people to do this, boy, would the world be a better place,” she said.

The role moderators should play

Sepp pointed out that while moderators may have to play an active role in helping to manage discussions in which misinformation may be shared, ultimately their role is not that of truth police, checking every piece of information shared by users.

Instead, their role is that of an objective observer: managing a discussion so it stays positive for the community as a whole, and organizing it so that it steers away from misinformation. (Some virtual event moderation tips are still useful even for online community managers.) Sepp suggested that community managers have a lot in common with journalists and might want to approach their roles as such. This applies not only to moderation but to community building; she suggested that organizations write guidelines that reflect journalistic integrity.

But beyond how they manage what appears in a forum, community managers should still feel empowered to make a call when necessary.

“I think the next step after guidance is to train community managers who are empowered and confident enough to moderate posts until more research has been done on this,” she said.

Sepp suggested that one way community managers can play this role is to develop an instinct to make a call when a discussion has run its course.

“It’s OK to end the conversation,” she said. “It’s OK to end the discussion, because it elevates that experience [when you don’t] let someone open that Pandora’s box again and again.”
