Source: Disqus.com blog
Pseudonymous commenters not only contribute more, but also contribute more high-quality comments, according to Disqus' analysis of the comments flowing through its system.
But this concept of "quality comments" is problematic. Looking at the research method, we see that positive comments are coded by how many likes or replies they receive; negative signals include the number of times a comment is flagged, marked as spam, or deleted. This leaves a lot of daylight between positive and negative comments, which manifests as a highly ambiguous zone designated "neutral." Surely these neutral comments, which make up a strong majority of anonymous comments, do not consist purely of uncontroversial or unexceptional content. Similarly, plenty of otherwise constructive or valuable comments surely get flagged or voted down.
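To make the ambiguity concrete, here is a minimal sketch (not Disqus' actual code; all signal names are assumed for illustration) of the kind of coding rule the method implies. Note how any comment with neither a positive nor a negative signal falls into the catch-all "neutral" bucket:

```python
# Illustrative sketch of the coding scheme described above.
# Signal names (likes, replies, flags, spam_marks, deleted) are hypothetical.

def classify_comment(likes: int, replies: int, flags: int,
                     spam_marks: int, deleted: bool) -> str:
    """Return 'positive', 'negative', or 'neutral' from raw engagement signals."""
    positive_signal = likes + replies
    negative_signal = flags + spam_marks + (1 if deleted else 0)
    if negative_signal > 0:
        return "negative"
    if positive_signal > 0:
        return "positive"
    # Everything with no signal at all lands in the broad "neutral" zone.
    return "neutral"

print(classify_comment(likes=3, replies=1, flags=0, spam_marks=0, deleted=False))
# -> positive
print(classify_comment(likes=0, replies=0, flags=0, spam_marks=0, deleted=False))
# -> neutral
```

A scheme like this cannot distinguish an unremarkable comment from a thoughtful one that simply drew no reactions, which is exactly the ambiguity at issue.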
This is not to sling mud at Mr. Ha or Disqus; I agree with the study's underlying premise that we need to get past the black-and-white conflict between real identities and pure anonymity. Rather, it is to draw attention to a trap that comment researchers like myself should be careful to avoid: the notion that there is some consistent way of coding positive and negative comments across different commenting communities. In reality, each commenting community tends to regulate its discourse according to its own values and norms, which entails different and evolving notions of positive or negative discourse. Naturally, this ultimately privileges some discourses and excludes others. Consequently, the question should not be how to encourage positive comments across the board, but how site managers in a particular community can do so.
To bring this back to our own ongoing research here at the Digital News Test Kitchen: before we can determine the proper course of action for a site like the Greeley Tribune (which last year shut down its user-comment feature, along with other newspaper websites in the Swift Communications chain), we need to understand the particular social relations at play in the comment section pre-shutdown. We believe that critical discourse analysis will be a valuable frame for such research, but more on that later.
I just wrote a political science thesis on the quality of comments on newspaper websites. There, "quality" was determined in relation to theories of deliberative democracy; in short, how valuable the comments are to a democratic discussion. I measured each comment on its level of mutuality (dialogue or not), tone (hatred, sarcasm, indifference, factual, praise), level of respect toward those addressed and those talked about, and finally several indicators of argument quality, where an argument was present at all. Together these constitute a measure of comment quality.

The project was intended to explore whether proactive moderation increased that quality, and it did to some extent. I acted as a moderator for three weeks at a Danish daily newspaper and measured an equivalent set of data. Results were positive on all indicators, but most significant on the number and quality of arguments. A very positive indirect result was that whenever people were actually addressing each other directly and engaging in conversation with arguments (claim + evidence), the tone and level of respect were much higher. So to achieve the quality I'm talking about, you need to stimulate direct conversation between readers and ask the right questions, or set a specific goal for the discussion that drives people to argue their case instead of just throwing comments into the open.
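The coding scheme above could be sketched roughly as follows. This is only an illustration of the structure of such a scheme; the specific categories come from the description above, but the numeric scores and weights are invented, not the thesis' actual operationalization:

```python
# Hypothetical sketch of a per-comment deliberative-quality coding scheme.
# Category names follow the indicators described (mutuality, tone, respect,
# argument = claim + evidence); all numeric values are illustrative only.

from dataclasses import dataclass

# Higher = more conducive to deliberation (invented ordering/weights).
TONE_SCORES = {"hatred": 0, "sarcasm": 1, "indifference": 2, "praise": 3, "factual": 4}

@dataclass
class Comment:
    is_dialogue: bool   # mutuality: does it address another participant?
    tone: str           # one of TONE_SCORES
    respect: int        # 0-2: respect toward those addressed and discussed
    has_claim: bool     # argument indicator: a claim is made
    has_evidence: bool  # argument indicator: the claim is supported

def quality_score(c: Comment) -> int:
    """Aggregate the indicators into a single illustrative quality score."""
    score = TONE_SCORES[c.tone] + c.respect
    score += 2 if c.is_dialogue else 0
    score += (1 if c.has_claim else 0) + (2 if c.has_evidence else 0)
    return score

# A factual, respectful reply with a supported claim scores near the top:
best = Comment(is_dialogue=True, tone="factual", respect=2,
               has_claim=True, has_evidence=True)
print(quality_score(best))  # -> 11
```

Unlike the engagement-signal approach, a scheme like this codes the content of each comment directly, which is what lets it capture the claim-plus-evidence effect described above.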
Hope it was useful to you.