The New York Times says it is going to expand the availability of online comments from 10 percent of articles to 80 percent by the end of the year, without adding more moderators to its staff.
How are they going to do this? With a machine-learning algorithm, of course.
The Times today is rolling out a new comment moderation system built on software from Google called Perspective, developed by the company’s incubator, Jigsaw. The Moderator tool will automatically approve some comments and help human moderators wade through the rest more quickly.
“What Moderator really is about is scale,” said Times community editor Bassey Etim, who oversees a core team of 14 moderators and is project manager for the new content management system for the Times’ moderation, which uses Perspective. He said moderators won’t be replaced by the software; rather, their jobs will be augmented.
This is just the first step in a longer process; the initial rollout is expected to raise the share of articles with comments sections to 25 percent. Over time, the paper will redesign comments, and Etim said writers will also start incorporating comments more into their reporting.
“We do already write stories based on a lot of user-generated comment across the site,” he told Recode, adding, “There’s much more of a robust culture nowadays at the Times of really treating reader feedback as if it is part of the reporting.”
More comments means more feedback for reporters to comb through. It also means more engagement on the Times’ website, which translates into more digital advertising revenue.
The incorporation of Perspective, which Jigsaw trained in part on the Times’ own data, into the newspaper’s online comments section comes at an interesting time. The paper recently eliminated the public editor role, which traditionally dealt with reader feedback.
Jigsaw’s Perspective software scores comments by how “toxic” they are, meaning how likely a comment is to make someone want to stop engaging in a conversation. You can experiment with the tool online to see its toxicity ratings for comments about topics like global warming and Brexit.
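For the curious, the Perspective API exposes this scoring through a single `comments:analyze` endpoint. The sketch below follows the request and response shapes in Perspective's public documentation, but `API_KEY` is a placeholder, the sample response is an abridged illustration, and no network request is actually made here.

```python
# Sketch of a Perspective API client; request/response shapes follow the
# public docs, but API_KEY is a placeholder and nothing is sent over the wire.
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"
    "?key=API_KEY"
)

def build_request(comment_text):
    """Build the JSON body asking Perspective to score TOXICITY."""
    return {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }

def toxicity_score(response):
    """Pull the 0-to-1 summary toxicity score out of a Perspective response."""
    return response["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

body = build_request("You are a moron.")
# A real client would POST this body as JSON to ANALYZE_URL.
sample_response = {  # abridged example of the API's response shape
    "attributeScores": {
        "TOXICITY": {"summaryScore": {"value": 0.92, "type": "PROBABILITY"}}
    }
}
print(toxicity_score(sample_response))  # 0.92
```

The score is a probability-like value between 0 and 1; what counts as "too toxic" is left to the integrator, which is why the Times keeps humans in the loop.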
“It's become too easy for trolls to dominate conversations online. People are either leaving the conversation entirely or comments sections are being shut down,” said Jigsaw CEO Jared Cohen in a statement. “The power of machine learning offers us an opportunity to tip the scales and reverse this trend.”
Wikipedia, which also contributed data to Google to train Perspective, used the tool for a study of its discussion pages.
The Times is using the software to automatically approve a small percentage of comments on top homepage stories, and to highlight potentially toxic phrases so that its 14 moderators, trained on the new software, can more quickly decide whether to approve a comment or keep it out of view.
No comment will be automatically rejected by the algorithm; a human moderator will be required to make the final decision.
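The workflow the article describes, auto-approving only clearly benign comments and routing everything else to a person, could be sketched as a simple threshold rule. The 0.1 cutoff below is an illustrative assumption, not the Times' actual setting:

```python
def route_comment(toxicity: float, approve_threshold: float = 0.1) -> str:
    """Route a comment based on its toxicity score (0 to 1).

    The 0.1 threshold is an assumed value for illustration. Note there is
    deliberately no auto-reject branch: anything not clearly benign goes
    to a human moderator, who makes the final call.
    """
    if toxicity < approve_threshold:
        return "auto-approve"
    return "human-review"

print(route_comment(0.03))  # auto-approve
print(route_comment(0.85))  # human-review
```

Keeping rejection out of the algorithm's hands means a false positive costs moderator time rather than a silenced reader.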
Comments that are likely to get booted, Etim said, are those that contain hate speech, are incoherent, are not relevant to the article, contain name-calling or fit other criteria for being toxic or counterproductive to conversation.
This article originally appeared on Recode.net.