Robo-reporting just another tool for journalists

The Quakebot algorithm wrote a similar article about many earthquakes, including one April 3. This map indicates the distance from the epicentre to Los Angeles.

Vick Karunakaran
Biz/Tech Reporter

On March 17, a brief article appeared on the Los Angeles Times website shortly after an earthquake woke the city. It laid out the bare facts of the quake, but the last line of the story may cause journalists to lose more than their sleep.

“This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author,” it said.

The story was generated by a computer program soon after the earthquake. Its creator, journalist Ken Schwencke, had programmed it to do so whenever an earthquake notification came in.

The algorithm took the new data and placed it into a pre-written template, Schwencke said in an interview with Slate magazine. The story was uploaded within about three minutes of the quake.
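That data-into-template approach can be shown with a short sketch. The snippet below is not Schwencke's actual Quakebot code; the field names, wording and sample values are hypothetical stand-ins for what an earthquake notification might supply.

```python
# A minimal sketch of template-based quake reporting -- not Schwencke's actual
# Quakebot code. The field names and sample values are hypothetical stand-ins
# for the data an earthquake notification might contain.

TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {distance_km} km from "
    "{place} at {time}, according to the USGS Earthquake Notification "
    "Service. This post was created by an algorithm."
)

def write_brief(quake: dict) -> str:
    """Drop the notification data into the pre-written template."""
    return TEMPLATE.format(
        magnitude=quake["magnitude"],
        distance_km=quake["distance_km"],
        place=quake["place"],
        time=quake["time"],
    )

if __name__ == "__main__":
    sample = {
        "magnitude": 4.4,
        "distance_km": 10,
        "place": "Los Angeles, California",
        "time": "6:25 a.m. Pacific time",
    }
    print(write_brief(sample))
```

The pre-written template carries all of the phrasing; the program only fills in the blanks, which is part of why such briefs read as predictable and uniform.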

While the earthquake algorithm made a lot of news, it was not the first time software had written a story, said Mike Karapita, program coordinator for Humber’s accelerated journalism program. Algorithms have been used to generate news before this, he said.

Karapita said routine news briefs tend to have a predictability and sameness about them, and he wouldn’t be too concerned if those stories were written by an algorithm. He said he would worry far less about basic news than about a computer program writing big news.

“Journalists are the eyes and ears of the community,” said Paul Morse, president of Unifor Local 87-M. That means the ability to provide their audience with the content that is most meaningful to them, he said.

“I don’t think a computer can do that,” he said.

Local 87-M represents 2,600 media employees in Southern Ontario, including employees at the Globe and Mail, Toronto Star, Metroland and the Hamilton Spectator.

Journalists have used computer-assisted reporting for quite a number of years, said Morse. One of the most effective and high-profile examples of that was the Code Red project by the Hamilton Spectator, he said.

The Code Red series of articles used massive amounts of data to show how socio-economic factors had an impact on people’s health, said Morse.

“Is a computer going to figure that on its own? Probably not,” he said. Journalists require the skills to use sophisticated computer programs to mine the data.

According to Karapita, there are two competing arguments: one says this will put journalists out of work. The other side suggests “using algorithms to take care of basic news frees up journalists [for] more advanced, complex feature pieces,” he said.

“The more that you rely on these systems blindly, you can end up…not being vigilant about your content and your style,” said Karapita. He said he’s not convinced algorithms are good for anything beyond basic, short news items.

A Swedish report published last month described a study investigating “how readers perceive software-generated content in relation to similar content written by a journalist.” Assistant Professor Christer Clerwall of Karlstad University used a sample of 46 undergraduates and asked them to focus on the text or message of the news content, the report said.

“Some aspects of quality, such as being clear and being pleasant to read, received a slightly higher score for human-written content,” the report said. However, other aspects, such as whether the content was informative, trustworthy or objective, scored higher for the automated content.

“The lack of difference may be seen as an indicator that the software is doing a good job, or it may indicate that the journalist is doing a poor job,” said Clerwall in his report.

Although those stories were generated by an algorithm, they are still checked by human eyes, said Carey French, program coordinator for journalism at Humber’s North campus.

“I have no problem with that,” he said.

“My concern would be the day the editor starts to trust the system,” said French.

Meanwhile, technology company Narrative Science recently launched a free app that translates data from Google Analytics into narrative reports. The application mines the source data and generates reports “in plain English,” the release said.
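The same fill-in-the-blanks idea scales to that kind of data-to-narrative reporting. The sketch below is purely illustrative and is not Narrative Science’s product: it turns two invented traffic figures into the sort of plain-English sentence such a tool might produce.

```python
# A hypothetical sketch of turning analytics-style figures into a plain-English
# sentence -- not Narrative Science's actual app, which works from Google
# Analytics data. The site name and numbers are invented for illustration.

def describe_traffic(site: str, visits_this_week: int, visits_last_week: int) -> str:
    """Phrase a week-over-week traffic change as a short narrative sentence."""
    change = visits_this_week - visits_last_week
    if change == 0:
        return f"Traffic to {site} held steady at {visits_this_week:,} visits this week."
    pct = abs(change) / visits_last_week * 100
    direction = "rose" if change > 0 else "fell"
    return (
        f"Traffic to {site} {direction} {pct:.0f} per cent this week, "
        f"to {visits_this_week:,} visits from {visits_last_week:,}."
    )

if __name__ == "__main__":
    print(describe_traffic("example.com", 12400, 10800))
```

As with the quake template, everything humanlike in the output is authored in advance; the software only chooses between pre-written phrasings based on the numbers it is given.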

“A computer is going to generate simply what it’s been told to generate,” said Morse.

It’s just doing something by rote without thinking about the implication or impact on the audience, he said.

“Journalism has to find a way of doing more with less,” said French. “We cannot get away from the fact that we need to find ways of doing some of the things we do less expensively.”

Karapita said journalism students should be fully informed about all the systems and trends out there so that when they get to the work world, they are not blindsided.

“I’m a dyed-in-the-wool Luddite,” said French. “But I use every available piece of technology because if I don’t, somebody else will.”

A similar article created by Schwencke’s algorithm.