News site tests readers understand story before letting them comment
Readers who want to comment on articles will now be forced to prove they understand what they're about.
That is the policy at Norwegian broadcaster NRK's website, at least, which will present would-be commenters with a short quiz about what the story actually says before letting them post.
The quiz's creators hope that answering the questions will ensure that everyone in the comments actually understands the story.
They also hope that the 15 seconds or so it takes to answer will give readers time to calm down, leaving fewer angry comments under stories.
On stories that use the quiz, readers are presented with three short multiple-choice questions that they must answer correctly. They might, for instance, have to say what an acronym used in the story stands for.
The creators of the tool – which for the moment is in use on NRKbeta, the broadcaster's tech page – hope that it will ensure that everyone knows what they're talking about before they actually start talking about it.
"We thought we should do our part to try and make sure that people are on the same page before they comment," journalist Ståle Grut told Nieman Lab, which first reported the news. "If everyone can agree that this is what the article says, then they have a much better basis for commenting on it."
The extra time should also force people to think a little more deeply about what they are commenting on.
“If you spend 15 seconds on it, those are maybe 15 seconds that take the edge off the rant mode when people are commenting,” said Marius Arnesen, editor of NRKbeta.
NRKbeta has developed a dedicated following of readers who comment positively on its articles. But occasionally its stories are published on the homepage or elsewhere – bringing a whole new set of often angry readers who tend to lower the tone of the comments, its journalists said.
Many sites have got rid of comments altogether, arguing that they are hard to moderate, tend to turn negative quickly, and that people can hold their own conversations on Facebook or elsewhere.
Other websites have attempted to tackle the problem with different technologies. Google announced last week, for instance, that it had built an artificially intelligent tool that can read through comments and identify whether or not they are "toxic".
Independent News Service